
Database Backups

#1
I'm posting this on here in the hope that someone can help.

I've got a fairly big project with a lot of info that I need to ensure doesn't disappear in the event of a server failure. I'm currently running a VPS that holds all of the application files as well as the database, and everything has been running smoothly. We're using an external S3-like service for hosting files, which has been great and is very cheap to run, and the application itself is all on GitHub, so I don't really have to worry about losing the files.

However, I'm becoming increasingly paranoid about the database in case anything happens, as it's getting quite large quite quickly. Does anyone have any experience with backup services? I've got a basic rclone sync of a backup directory at the moment, which is doing the job, but I don't really trust myself (I'm not the most knowledgeable about command-line things and I'm getting a bit out of my depth).
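
For what it's worth, the current setup is roughly along these lines (the database name, paths, and rclone remote are placeholders, and it assumes MySQL credentials are in ~/.my.cnf):

#!/bin/sh
# Nightly dump-and-sync sketch: dump the database, compress it, push the backup directory off-site.
set -e

BACKUP_DIR=/var/backups/mysql
DB_NAME=myapp
STAMP=$(date +%Y-%m-%d)

mkdir -p "$BACKUP_DIR"
mysqldump --single-transaction "$DB_NAME" | gzip > "$BACKUP_DIR/$DB_NAME-$STAMP.sql.gz"

# Sync the whole backup directory to an off-site rclone remote
rclone sync "$BACKUP_DIR" offsite-remote:db-backups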

Previous hosting providers I've used ran JetBackup for their cPanel backups, and I had a look today: they offer a standalone Linux version which might be useful for us (it's only around $6/month). Has anyone got any other suggestions?

Thanks!
Reply
#2
I'm going to guess you're using DigitalOcean or similar? If so, we just moved to DO and we've had a few hiccups, but otherwise we've been happy so far.

We're looking into a solution for the exact same thing, as a matter of fact. I generally do file backups manually once every month or so (we're a forum, so the only things that really change regularly on the file side are the avatars and attachments). The database is something I back up about every two days or so, and we've had to do it manually. I've got about 100 dumps sitting on my computer right now, and they add up quickly.

I will say that, although host-made backups are great, they can screw you over if your host goes under or your account gets suspended; I've heard of that happening all too many times. Host-made backups are fine for dailies, but at least every couple of weeks an offsite backup should be made in case the worst happens. Having your backups and your data with the same host or company can be a disaster if they decide to suspend you over a phony abuse report from some bad actor, or if an act of God takes down the datacenter.

So yeah, we're pretty much looking into solving the exact same problem. In the past, I used shell scripts to do this sort of thing for a fairly large company running on AWS. We had about 12 EC2 and RDS instances combined, and it worked very well for keeping everything in sync. We were actually using shell scripts for files only, but I don't see any reason the same approach couldn't work for databases too. Cron can run them automatically and SSH/SCP can move them to offsite locations.
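
Roughly what I have in mind for the database side, as a sketch (the database name, paths, and remote host are placeholders):

#!/bin/sh
# Cron-driven database backup pushed off-site over SSH.
set -e

DB_NAME=forum
DUMP=/tmp/$DB_NAME-$(date +%Y-%m-%d).sql.gz
REMOTE=backupuser@offsite.example.com:/backups/

mysqldump --single-transaction "$DB_NAME" | gzip > "$DUMP"
scp "$DUMP" "$REMOTE"
rm "$DUMP"

# Example crontab entry to run it nightly at 03:30:
# 30 3 * * * /usr/local/bin/db-offsite-backup.sh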

I'll be sure to share whatever I come up with. By then you might have a solution that works even better.

Reply
#3
I have news: I signed up for the JetBackup trial and it only supports CentOS. We're running on Ubuntu, so back to the drawing board.

I'm actually with Scaleway and they've been great so far (it was cheaper than DO to beef the server up a bit to deal with the traffic, so we stayed). From what I recall, DO do offer a snapshot/backup service, and I've just had a look and Scaleway offer something similar, but is it too much to ask to just back my database up?

They do offer a managed SQL service which does have automated backups, but I feel like it may be overkill because, as a whole, everything has been perfectly fine on this server so far.

Either way, I'm still looking. At the moment I'm using Percona XtraBackup, compressing the result and storing it off-site, but the file size before compression is about 250MB and after it's 2MB, which concerns me. I'm not sure if it's just very compressible data or whether something is going wrong, hence this quest.
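
For reference, the backup step is roughly along these lines (directory names are placeholders, and it assumes xtrabackup can pick up credentials from the MySQL config):

# Take and prepare a full backup, then compress it for off-site storage
STAMP=$(date +%Y-%m-%d)
xtrabackup --backup --target-dir=/var/backups/xtra/$STAMP
xtrabackup --prepare --target-dir=/var/backups/xtra/$STAMP
tar -czf /var/backups/xtra-$STAMP.tar.gz -C /var/backups/xtra $STAMP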
Reply
#4
You could always try extracting the backed-up data after compression to see. What format are they using for the compression?

If it's gzip, 250MB -> 2MB is fishy and hard to believe. LZMA could plausibly manage that sort of ratio, though. I took a 10MB SQL file and tried 7z on it, and it went from 10MB -> 360KB. Pretty surprising, honestly.
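
For example, something like this would let you check (filenames are placeholders):

# Decompress a copy of the backup and compare the sizes
gzip -dk backup.sql.gz
ls -lh backup.sql backup.sql.gz

# Quick gzip vs LZMA (7z) comparison on the same dump
gzip -9k dump.sql && ls -lh dump.sql.gz
7z a dump.sql.7z dump.sql && ls -lh dump.sql.7z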

Reply
#5
Hi, I've been using MySQLDumper since my WBB days ... the cron jobs are only controlled by a Perl file, but I'm still satisfied with it. Unfortunately, MySQLDumper has been discontinued since 2018.


 
Reply
#6
Maybe it's just not a lot of data. I'm at the point where I might just write a system to do it myself.
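
To rule that out, I could compare the actual table sizes against the dump, something like this (the database name is a placeholder):

# Rough size of data plus indexes per table, to compare against the dump size
mysql -e "SELECT table_name,
                 ROUND((data_length + index_length)/1024/1024, 1) AS size_mb
          FROM information_schema.tables
          WHERE table_schema = 'your_database'
          ORDER BY size_mb DESC;"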

I did email JetBackup and they replied within minutes, but they didn't appear to have read what I'd written, which isn't a great start. Canned replies drive me insane.
Reply



