Easy way (in Bash Script) to determine a path?

Forum: Linux | Total Replies: 20
TxtEdMacs

Apr 30, 2009
1:40 PM EDT
That is, where files are found [NOT the $PATH].

This is what perl* is designed to do; however, I have not seen much to indicate that bash scripting is meant for such a task. Am I mistaken?

* I am sure I could cook up something using either php or even python without great difficulty.
DarrenR114

Apr 30, 2009
3:00 PM EDT
You mean something like "which"?

Or maybe you're thinking of "find"?

Or maybe you're thinking of "dirname" to go with its complement "basename"?

Cut and paste the following into a file called "findpath.sh":
==================================
#!/bin/bash
# set the 'file' variable first
file=`which $1`

# get extension; everything after last '.'
ext=${file##*.}
echo "extension: " $ext

# basename
basename=`basename "$file"`
echo "basename: " $basename
# everything after last '/'
basename=${file##*/}
echo "basename: " $basename

# dirname
dirname=`dirname "$file"`
echo "dirname: " $dirname
# everything before last '/'
dirname=${file%/*}
echo "dirname: " $dirname
========================
This was a Q&D one-off, so I didn't put any error handling into it.
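[For reference, a sample run, assuming `which bash` resolves to /bin/bash on your box (the exact path varies by distro); since /bin/bash contains no '.', the extension line simply echoes the whole path back:

$ ./findpath.sh bash
extension:  /bin/bash
basename:  bash
basename:  bash
dirname:  /bin
dirname:  /bin
]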

What's a Windoze loser like you doing playing around with BASH anyways? Don't you know that CLI is only for people who know what they're doing, like me? (If you don't like my comment, then maybe you should think before you make an arse of yourself just because someone attacks your idol, RMS. At least I know how to use bash - as well as ksh, csh, and zsh)
TxtEdMacs

Apr 30, 2009
4:19 PM EDT
Quoting: What's a Windoze loser like you doing playing around with BASH anyways?
Just because I shout the praises (deservedly, I might add) of Vista Extreme Extravaganza*, there is no need for the vitriol. MS makes the best operating systems possible; the gaming** environment is unmatched! I forgive you for your misplaced envy; go chastened, but know you are forgiven, and sin no more.

[By the way, your code looks good, thanks. I will test and get back to you.]

* This edition of Vista is not available to the hoi polloi. It is available only upon special order and payment in cash, in small bills with random serial numbering, purchased at a darkened door well within the MS campus and well after midnight. See, how could it be more secure?

** I own one of the few copies available in an unopened box that I hug constantly, knowing it is of unmatched quality. Moreover, I can attest it has never locked up or shown the mythical blue screen of death on any machine I own. Match that, Linux Lover, or to be really insulting: you probably think Ubuntu ==[=] Linux. That is, an exact match.

[Let me know if that was not insulting enough. If so I will try harder.]
DarrenR114

Apr 30, 2009
5:45 PM EDT
Look Windoze fanbois,

Everyone knows that Linux==Minix.

As for games, when you have xbill, nothing else compares.

For fun with the limitations of rm - cut and paste the following into a file called rm0limit.sh:
=========================
#!/bin/bash
let i=702117030
while [[ $i -le 702127030 ]] ; do
    let j=1
    while [[ $j -le 100 ]] ; do
        cp $0 rm$j$i.sh
        let j+=1
    done
    let i+=1
done
=======================

Execute the rm0limit.sh script you just created (it'll take a few minutes to finish) - then figure out the most efficient bash command to remove the copies, without removing the original.

gus3

Apr 30, 2009
6:24 PM EDT
Aaargh! The long string of equals signs wrecked the margins.

Five of them in a row, rather than five hundred, will do just fine, kthxbye.
DarrenR114

Apr 30, 2009
7:03 PM EDT
I cut down on the equal signs (leaving enough to extend beyond the longest line of script) and I also changed the counter in the outer while loop.

After you execute the script, you'll be set up to figure out a good way to work around the limitations of 'ls' and 'rm'.

It's not an impractical exercise at all, as it is not uncommon to encounter directories with hundreds of thousands of log files going back years.
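[A sketch of one possible answer, for anyone following along at home: a plain 'rm rm*.sh' on a million generated files will typically die with "argument list too long", so let find do the expansion instead of the shell:

# delete the generated copies while keeping the original rm0limit.sh;
# find hands the names to -delete itself, so ARG_MAX never comes into play
find . -maxdepth 1 -name 'rm[0-9]*.sh' ! -name 'rm0limit.sh' -delete

# or stream the names to rm via xargs
find . -maxdepth 1 -name 'rm[0-9]*.sh' ! -name 'rm0limit.sh' -print0 | xargs -0 rm
]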
gus3

Apr 30, 2009
7:41 PM EDT
Much better, thanks.
TxtEdMacs

Apr 30, 2009
8:10 PM EDT
DarrenR...,

I should have read your full explanation, because I see (seriously, albeit uncharacteristically) you made a fallacious assumption. That is, you assumed the file types I was seeking were commands or executables; they are neither. I have used the "which" (and "type") commands previously, and they are a nice means of seeing whether files exist and where they reside. My use case is looking for potential duplicates of prosaic file types (images), which I suspect is a harder task to handle with shell scripting. Running your code I get all nulls.

I had hoped you knew of some built-in commands I had not yet encountered. I was buoyed by your quick response, which raised my hopes that I had missed seeing important parts of shell scripting. However, my skepticism is renewed: I believe bash scripting is a tool, but one with limited power. I should note I am a perl skeptic too (an early Python 2 was far ahead of Perl 5.6 when I looked at both).

Your Buddy Txt. (aka YBT)
DarrenR114

Apr 30, 2009
10:35 PM EDT
@YBT,

I wasn't sure; that's why I asked if you meant "find" or "which".

I went ahead and included the example for 'which' because you did say "path", and $PATH is for executables.

find isn't "built-in" but it is very useful for bash scripting.

If a geek wants to find the directory names of where he has all his p0rn files - all of them named with .jpg or .gif: find / \( -name "*.gif" -o -name "*.jpg" \) -exec dirname {} \; 2>/dev/null
krisum

May 01, 2009
4:29 AM EDT
Here is a slightly more detailed way of finding files with required extensions and possible duplicates:

find / -iname '*.jpg' -o -iname '*.gif' 2>/dev/null | { while read fpath; do echo $fpath; fnames="$fnamesn`basename "$fpath"`"; done; echo -e "nnPossible Dups:"; echo -e "$fnames" | sort | uniq -d | { while read dup; do find / -name "$dup"; done; }; }

This will print all the "*.jpg" and "*.gif" files under the root directory, matched case-insensitively, followed by the paths of possible dups listed at the end. This simplistic determination of possible duplicates is based on an exact match of file names, and it also handles paths with spaces.
dinotrac

May 01, 2009
10:04 AM EDT
Wow. You guys are truly amazing.

All that effort when you can just hire the "Geek Squad". And then teach them Linux.
gus3

May 01, 2009
10:12 AM EDT
krisum, DarrenR114:

Are you sure about the syntaxes there? If your backslashes are stripped from display, duplicate them and check in Preview.
TxtEdMacs

May 02, 2009
4:11 PM EDT
krisum,

Once I added my code, added the missing backslashes (thanks gus3), and removed my debugging text files that caught intermediate results, this is exactly the code I needed. Sorry I gave you the impression it was leading away from what I sought.

Thanks,

Txt.
azerthoth

May 03, 2009
12:00 PM EDT
Good grief, that's an awful lot of work for 'locate .jpg'.
krisum

May 03, 2009
2:20 PM EDT
Good to know that this was what you required and that it worked. I forgot about the \'s; sorry for the trouble. As you have figured out, the extra "n"s were actually "\n"s. Reproducing the correct one below:

find / -iname '*.jpg' -o -iname '*.gif' 2>/dev/null | { while read fpath; do echo $fpath; fnames="$fnames\n`basename "$fpath"`"; done; echo -e "\n\nPossible Dups:"; echo -e "$fnames" | sort | uniq -d | { while read dup; do find / -name "$dup"; done; }; }
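[Purely for readability, the same pipeline unrolled onto multiple lines; it should behave identically to the one-liner above:

find / -iname '*.jpg' -o -iname '*.gif' 2>/dev/null | {
    # list every match, remembering just its file name for the dup check
    while read fpath; do
        echo $fpath
        fnames="$fnames\n`basename "$fpath"`"
    done
    echo -e "\n\nPossible Dups:"
    # any name that occurs more than once is looked up again to print its paths
    echo -e "$fnames" | sort | uniq -d | {
        while read dup; do
            find / -name "$dup"
        done
    }
}
]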
krisum

May 03, 2009
2:22 PM EDT
@azerthoth

Well, the code does more than just "locate *.jpg". It actually tries to find the duplicates based on duplicate file names (i.e. two files with the same name but different paths).
azerthoth

May 03, 2009
3:36 PM EDT
futile without also checking filesize, otherwise you are rejecting a whole slew of false positives. A hash would be better, but filesize at the minimum.

*edit* which is why I made the mildly sarcastic remark in the first place */edit*
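[A rough sketch of the size-first idea, assuming GNU find (for -printf) and a stand-in /path/to/dir: pad the byte size to a fixed width so uniq -w can group on it, and only files that share a size would then need hashing:

# print "<size> <path>" with the size padded to 20 columns, then keep only
# lines whose size field repeats -- the candidates for a full hash check
find /path/to/dir \( -iname '*.jpg' -o -iname '*.gif' \) -printf '%20s %p\n' \
    | sort -n | uniq -w 20 -D
]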
krisum

May 03, 2009
4:09 PM EDT
As mentioned, the above code assumes that names are the same for duplicates. It can give false positives, and manual filtering may be required -- it is not clear why you say the code can *reject* false positives. If the names can be different then, yes, the results may not be correct. However, it is not clear whether the file contents need to be exactly the same for duplicates, so the validity of a hash check also depends on the requirement.
krisum

May 03, 2009
4:34 PM EDT
Duplicates using exact md5sum match (handles filenames with spaces):

find /path/to/dir -iname '*.jpg' -o -iname '*.gif' | xargs -d '\n' md5sum | sort -k 1,1 | uniq -w 32 -D
azerthoth

May 03, 2009
5:44 PM EDT
You're still missing the sarcasm tag; I'll make it more obvious next time.
TxtEdMacs

May 03, 2009
6:55 PM EDT
az,

Sorry, implicit sarcasm tags are disallowed by editorial fiat. It is written between the lines of the TOS that only I should be read with an implicit attempt at humor [of which sarcasm is a subset]; indeed, most times I have to add explicit serious tags in the few instances where that is appropriate. So no sarcasm, humor, or jokester tags for you, unless they are explicit, upper case, and rouge colored. No variation allowed, understood?

YBT
