andamira Posted February 28, 2014

A couple of weeks ago I ran a minimal disk write speed test on Icarus and saved the result:

23:02 0 andamira@icarus$ dd if=/dev/zero of=test bs=64k count=16k conv=fdatasync && rm -f test
1073741824 bytes (1.1 GB) copied, 108.887 s, 9.9 MB/s

For comparison, this is the result of the same test on another server I have an account on:

andamira@710c5:~$ dd if=/dev/zero of=test bs=64k count=16k conv=fdatasync && rm -f test
1073741824 bytes (1.1 GB) copied, 6.50854 s, 165 MB/s

I wondered what the reason for such a difference might be, but since the websites on Icarus are very responsive I wasn't worried, just curious. This thread seems to answer the question.
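The MB/s figures dd reports follow directly from the byte count and elapsed time in the output above; a quick arithmetic sanity check (no assumptions beyond the numbers already posted):

```shell
# Verify dd's reported throughput: bytes copied / elapsed seconds.
# 16k blocks of 64 KiB = 1073741824 bytes (1 GiB) in both tests.
awk 'BEGIN {
  bytes  = 1073741824           # total bytes written, as reported by dd
  icarus = bytes / 108.887      # Icarus: elapsed seconds
  other  = bytes / 6.50854      # other server: elapsed seconds
  printf "Icarus: %.1f MB/s\n", icarus / 1e6
  printf "Other:  %.1f MB/s\n", other  / 1e6
}'
```

Both results match what dd printed: 9.9 MB/s on Icarus (right at the 10 MB/s cap discussed below) versus 165 MB/s on the unthrottled server.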
Michael D. Posted February 28, 2014

Andamira - all of our cPanel accounts have been limited to 10 megabytes/second for about the last year, which is a perfectly reasonable limit if you think about it: a full HD 1080p stream only uses 0.9 to 1.1 megabytes/second [up to about 9 Mbps]. We're not streaming 1080p HD - we're streaming text files [html, php, js, css] and images. Some accounts do have larger images and media, but again, 10 megabytes/second is plenty even for that, unless the site is so busy that it would hit other issues before it ran out of I/O.

What this limit does allow us to do is ensure stability. Say, for example, you decide to archive/compress 5 GB worth of data within your account - that process will get limited, which ensures that everybody else on the server has disk I/O available for actively serving their site/pages and resolving their MySQL queries. The average site uses anywhere from a few kilobytes per second up to 0.5-0.6 megabytes/second, and even the really image/media-heavy ones tend to use at most 5 megabytes/second at their highest spikes. 10 megabytes/second is plenty for serving hosted content, by a large margin.
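Mike's headroom argument can be checked with simple division, using the per-workload figures he quotes above:

```shell
# How many concurrent workloads fit under the 10 MB/s per-account I/O cap,
# using the per-stream and per-site figures quoted in the post above.
awk 'BEGIN {
  cap = 10.0                                              # MB/s per account
  printf "1080p streams (~1.0 MB/s each): %d\n", cap / 1.0
  printf "Average sites at peak (~0.5 MB/s): %d\n", cap / 0.5
  printf "Media-heavy spikes (~5 MB/s): %d\n", cap / 5.0
}'
```

So one capped account could, in principle, sustain ten simultaneous 1080p streams or twenty average sites at their busiest - which is the sense in which 10 MB/s is "plenty by a large margin" for ordinary hosting traffic.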
andamira Posted February 28, 2014

Thank you Mike. That leaves me with a couple of questions of a more private nature, so I'll send you a PM.
Brad Posted February 28, 2014

Interesting. I was creating database backups a few days ago and thought to myself that it's probably high time I downloaded a complete backup... I realize from your perspective this is a different animal, of course. I could live with a weekly backup schedule - after all, we'd still have the ability to generate our own at any time. A weekly schedule better distributes the onus between host and client, I think.
Michael D. Posted February 28, 2014

If you can download at 10 megabytes/second [80 megabit] you're luckier than most, as that's nearly 10% of the server's total connectivity [before even considering disk I/O]. That said, this limit does not apply to FTP - I set the limit on my personal account to 50 kilobytes/second and was still able to download a 1 GB file at 57 megabit [the limit of my connection].
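The bracketed conversion in Mike's post is just megabytes to megabits. As a sketch of the "nearly 10%" claim, here is the same arithmetic against a 1 Gbit/s uplink - note the uplink size is an assumption on my part; Mike only says 80 megabit is close to a tenth of total connectivity:

```shell
# Convert the 10 MB/s cap to megabits and express it as a share of an
# assumed 1 Gbit/s server uplink (the uplink size is NOT stated in the
# thread -- it is inferred from "nearly 10% of total connectivity").
awk 'BEGIN {
  mbit = 10 * 8                       # 10 MB/s -> 80 Mbit/s
  printf "Cap: %d Mbit/s\n", mbit
  printf "Share of a 1 Gbit/s link: %.0f%%\n", mbit / 1000 * 100
}'
```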
Michael D. Posted February 28, 2014

I'm splitting this discussion about disk I/O off from the R1Soft thread, since it was unrelated, into this thread.
Brad Posted February 28, 2014

Yup, I get about 58 to 78, but I'm in no hurry. I just start it when I go to bed and let it run as long as it takes on a spare PC. It's a bit inconvenient, since I'm running my download connection at full speed for about 20 hours to pull down 8 GB of content. But there are resourceful ways of minimizing what needs to be backed up. My user content upload folders don't change that often, so excluding them from the backup would halve the size of a complete backup. What I'd love is something that would only back up what has changed or doesn't exist yet. I tried that with FileZilla, but I didn't like the way it ran - even the check/compare step seemed to take too long...
Michael D. Posted February 28, 2014

Use FTP - it does not have this limit, and you can download at the maximum speed of your connection. If you're already doing it via FTP, then your downloads aren't being limited. That said, the limit for semi-dedicated accounts defaults to 20 megabytes/second, or 160 Mbps.
Brad Posted February 28, 2014

Ah, ok. I actually tested the R1Soft backup manager download process for five minutes just before my previous post, which gave me the results I posted. So for me, R1Soft and FTP both give the maximum throughput my ISP is providing. Do you happen to know of some kind of app/client that could back up only the changes on an account?
Michael D. Posted February 28, 2014

Data from R1Soft comes from a completely different, dedicated backup server, so the limits on your hosting server would not apply. You can use rsync for files, but as far as databases go - I do not know.
Brad Posted February 28, 2014

Oh, ok. I'll check that out. My databases are a cinch to work with, so a files-only solution would be perfect. Thanks. http://wiki.r1soft.com/display/TP/rsync+Backup