lftp large files being accessed

Discussion in 'Linux Networking' started by jcharth, May 28, 2013.

  1. jcharth

    jcharth Guest

    Hello, I am using lftp to back up my webserver. I would rather use rsync, but I can get really cheap sftp storage. I am worried about backing up files that are in use, like log files. Is there an option to back up these files using lftp? Can I load them in memory or skip large files? What is the best way to do this? Maybe back up locally and sync the local backup to the remote server?
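    Right now I am basically just doing a reverse mirror with lftp. Something like this is what I have in mind (the host name and paths are made up, and I am only guessing that excluding the logs is the right option):

        # upload the web root to the sftp storage, skipping the busy *.log files
        # (sftp://backup.example.com and both paths are placeholders)
        lftp -u backupuser sftp://backup.example.com -e "
          mirror --reverse --only-newer --verbose \
                 --exclude-glob '*.log' \
                 /var/www/ /backups/www/ ;
          quit
        "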
     
    jcharth, May 28, 2013
    #1

  2. Jorgen Grahn

    Jorgen Grahn Guest

    What does that mean? To me, "sftp" implies ssh, and rsync works over
    ssh. It seems to me it would be stupid of a storage provider to
    implement sftp support, but disable rsync-over-ssh.
    What worries you in particular? What do you *want* to happen?
    /Jorgen
     
    Jorgen Grahn, May 28, 2013
    #2

  3. jcharth

    jcharth Guest

    I have log files that are 800+ MB and they change constantly. Would rsync handle these files better than lftp? My network connection is gigabit, so I could rsync the files to a local machine and then ftp them to the cloud. The connection to the cloud is slow. If it takes 20 minutes to ftp a large file, the log file might be locked for too long.
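    So the plan would be something like this, roughly (host names and paths are just placeholders, I have not tested any of it yet):

        # stage 1: fast rsync over the gigabit LAN, so the live log files
        # are only being read for a short time
        rsync -a --delete /var/www/ backuphost:/backup/www/

        # stage 2: on the backup machine, push the local snapshot to the
        # cheap sftp storage with lftp; this can be slow because the live
        # files on the webserver are no longer involved
        lftp -u backupuser sftp://backup.example.com -e "
          mirror --reverse --only-newer /backup/www/ /remote/www/ ;
          quit
        "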
     
    jcharth, May 29, 2013
    #3
  4. Jorgen Grahn

    Jorgen Grahn Guest

    Are you responding to my posting? If so, please quote the relevant
    parts, and reply to that one instead of your own posting. I won't
    make the extra effort to dig up my posting and try to match it with
    yours -- and neither will others, I suspect.

    /Jorgen
     
    Jorgen Grahn, May 29, 2013
    #4
