FTP Client that allows me to look in large directories



Morgenmuffel
09-09-2009, 04:07 PM
Hi

Problem is this

A client has been adding images to a directory, and there are now around 5000 images in it. What I want to do is find all those over a certain size, download them, then resize and resample them. I can't really run a blanket script on the server, as there are a couple of files that need to stay large (although I could possibly replace those). When I try to view the directory through cPanel, the "a script on the page has become unresponsive" error happens repeatedly; if I click Continue it keeps going a little, then the error happens again and again until it finally crashes, and if I click Cancel it returns the results up to that point.

As for other methods: FireFTP truncates the file listing for a directory at 2000 entries, and my ancient Dreamweaver truncates at even fewer.

Does anyone have any experience with an FTP client that won't truncate at 2000 records? Preferably free, and not CuteFTP, as my trial appears to have expired and I don't know that it would do this anyway.

Thanks

Sweep
09-09-2009, 04:17 PM
Try this one.

http://filezilla-project.org/

Morgenmuffel
09-09-2009, 04:18 PM
OK 2000 seems to be a server limit, damn

hueybot3000
09-09-2009, 04:25 PM
Why not download the whole directory and do it that way? Or use server-side software to resize them all? Do you have MySQL etc. available?

Morgenmuffel
09-09-2009, 04:39 PM
Why not download the whole directory and do it that way? Or use server-side software to resize them all? Do you have MySQL etc. available?
The directory is extremely large, and as previously mentioned there are files in it that shouldn't be resized. I can only tell which files those are by reading their names, so downloading everything is plan Z.

sal
09-09-2009, 07:39 PM
Sounds like you're gonna need to download the directory at some point to resize and review anyway, so rather than viewing the directory's contents remotely, drag it to your desktop and sort it out locally.

Morgenmuffel
09-09-2009, 09:48 PM
I was hoping there was some kind of command I could run in the FTP client

sort of like

list all files over 100K

I tried

Find -size +100000

But I'm guessing that only works on Linux.

robbyp
09-09-2009, 09:55 PM
OK 2000 seems to be a server limit, damn

Who's the web host? It's not one of those cheap rubbishy ones, is it? A 2000-file listing limit sounds like a server problem, as FTP programs can list many more files than that, and I have never encountered that problem before.

vinref
10-09-2009, 07:23 AM
Are you able to download only the directory listing, instead of the complete directory itself, with your FTP client? You can then sort through the list, find the name(s) of the appropriate file(s), and download those directly. I have never encountered the 2000-file list limit.

I use libcurl on *nix, and it can do this, but I do not recall a Windows version.

sal
10-09-2009, 07:40 AM
If you have PHP on that server, this script might be of some use (http://paste2.org/p/420044). You could tweak it to be more helpful depending on what you need.
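
As a rough illustration of the kind of script meant here (a sketch only, not Sal's actual code; the images/ path and 100 KB cutoff are assumptions):

<?php
// Hypothetical sketch: list every file in $dir larger than
// $threshold bytes, with its size in KB.
$dir = __DIR__ . '/images';   // assumed location of the image directory
$threshold = 100 * 1024;      // assumed 100 KB cutoff

foreach (scandir($dir) as $name) {
    $path = $dir . '/' . $name;
    if (!is_file($path)) {
        continue;             // skips subdirectories, '.' and '..'
    }
    if (filesize($path) > $threshold) {
        echo $name . ' (' . round(filesize($path) / 1024) . " KB)\n";
    }
}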

somebody
10-09-2009, 07:45 AM
You could probably do all the resizing server-side with a PHP script (that way you don't have to download anything), and just code exemptions into that script so that the files you want left alone are ignored. Someone like Erayd would be the best person to ask about how to achieve this.
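
A minimal sketch of that approach, assuming PHP's GD extension is available; the directory path, exemption file names, and target width are all made up for illustration:

<?php
// Hypothetical sketch of server-side resizing with PHP's GD extension.
// $keep lists files that must stay full size (names are made up here).
$dir  = __DIR__ . '/images';             // assumed image directory
$keep = array('banner.jpg', 'hero.jpg'); // assumed exemption list
$maxWidth = 800;                         // assumed target width

foreach (glob($dir . '/*.jpg') as $path) {
    if (in_array(basename($path), $keep)) {
        continue; // leave exempted files alone
    }
    list($w, $h) = getimagesize($path);
    if ($w <= $maxWidth) {
        continue; // already small enough
    }
    $newW = $maxWidth;
    $newH = (int) round($h * $maxWidth / $w);
    $src = imagecreatefromjpeg($path);
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, $path, 85); // overwrite in place at quality 85
    imagedestroy($src);
    imagedestroy($dst);
}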

Morgenmuffel
10-09-2009, 10:02 AM
OK, I have had a brainwave. What if I used Sal's script, but instead of outputting a list,
I used it to generate dummy files with the same names (in a different directory, of course),
downloaded those dummy files, which would be tiny, and put them in my local images directory,
then highlighted them and selected "download remote version" (or whatever the equivalent command is),
which should overwrite the dummy files with the large ones off the server?

Or does that sound too convoluted? It should be fairly easy though.
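
A quick sketch of that dummy-file idea, assuming the same hypothetical images/ layout as above and a made-up dummy/ staging directory:

<?php
// Hypothetical sketch: for every oversized image, create an empty
// file of the same name in a staging directory. Downloading the
// staging directory then gives a tiny local mirror of the big files.
$dir       = __DIR__ . '/images';  // assumed image directory
$staging   = __DIR__ . '/dummy';   // assumed staging directory
$threshold = 100 * 1024;           // assumed 100 KB cutoff

if (!is_dir($staging)) {
    mkdir($staging);
}
foreach (scandir($dir) as $name) {
    $path = $dir . '/' . $name;
    if (is_file($path) && filesize($path) > $threshold) {
        touch($staging . '/' . $name); // zero-byte placeholder
    }
}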

Morgenmuffel
10-09-2009, 10:20 AM
OK, help: how long would it take me to download 600 megabytes?

I just checked on the consumer speed test, and it says my download speed is 9.86 Mbps and upload speed is 0.16 Mbps. The download speed seems awfully high to me.

Anyway, my calculations ended up at around 8 hours to download. Does that sound about right?

It'll probably take longer as the day goes on.

KarameaDave
10-09-2009, 10:42 AM
600 MB = 4800 Mb, and 4800 Mb divided by 9.86 Mb/s comes to about 487 seconds, i.e. roughly eight minutes...
However, this does not take into account TCP/IP overheads, and of course it assumes
the server will dish it out at your maximum speed. (Your eight-hour figure looks like the result of dividing by the 0.16 Mb/s upload speed; downloads run at the 9.86 Mb/s rate.)
When I download Linux ISO files of around 600 MB on a 4 Mb/s connection,
they typically take between 20 minutes and an hour.

Morgenmuffel
10-09-2009, 10:53 AM
OK, using FileZilla, is there a way to stop it downloading the subdirectories, as the subdirectories don't show up in the directory listings?

As in, I just want to download the contents of the images folder, not the subdirectories within this folder.

Erayd
10-09-2009, 10:59 AM
Do you have shell access to the server by any chance?

Which OS is the server running?

Morgenmuffel
10-09-2009, 11:05 AM
Do you have shell access to the server by any chance?

Which OS is the server running?
Only access through cPanel.

Erayd
10-09-2009, 11:08 AM
Only access through cPanel.
Ouch.

Which OS is the server running?

Morgenmuffel
10-09-2009, 11:28 AM
Ouch.

Which OS is the server running?

No idea; the kernel version is 2.6.18-6-xen-686.

I am thinking downloading is the best option right now; the main problem is a large subdirectory full of images that don't need resizing or resampling.

I just can't work out a way to exclude this subdirectory.

Actually, there is a ticket on the FileZilla developers' tracker that says this has been done, but I can't figure out how to do it.

OK, I think the FileZilla answer must be here:
http://wiki.filezilla-project.org/Filename_Filters

I always seem to search for the wrong terms.

Erayd
10-09-2009, 11:46 AM
No idea; the kernel version is 2.6.18-6-xen-686.
That means Linux (in a Xen VM), and judging by that version string I'd say it's Debian, most likely Etch (4.0), although possibly Lenny (5.0).

If all you're wanting is a directory listing of files over a certain size, you can indeed use the find command, as referenced in an earlier post. The trick is to run that command from PHP, rather than via the FTP shell - find is not a valid FTP command.

Use the PHP passthru() (http://nz.php.net/manual/en/function.passthru.php) function to execute your command and spit the result out to the browser.
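
A minimal sketch of that, with a made-up directory path (the real location depends on the account):

<?php
// Minimal sketch of the suggestion above: run find via PHP and stream
// the output to the browser. The path is a hypothetical example.
// -maxdepth 1 keeps find out of subdirectories; -size +100k means
// "larger than 100 KB" in GNU find; -printf prints size and path.
passthru('find /home/account/public_html/images -maxdepth 1 -type f -size +100k -printf "%s %p\n"');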

Does this solve your problem, or have I missed what you're trying to do?

Morgenmuffel
10-09-2009, 08:27 PM
OK, cheers all. I have used Sal's script, setting it to copy all files over a certain size in the images directory into another directory. I'll then download those files, resize and resample them, and re-upload them to the images directory to overwrite the large versions.

Convoluted, I know, but I found I couldn't download the whole directory.

So thanks, all.