
Cloud backup for VFP 9 application?


jongoad

Programmer
Oct 12, 2007
US
The company I work for is interested in offering some kind of cloud backup solution to the users of our software. We'd want to keep it simple and only back up the application and data files for our specific software application (written in Visual FoxPro 9 SP2, using a VFP database). If possible, we'd like to be able to rebrand/white-label the backup product so it appears to be another product from us rather than from some third-party provider. I'm curious whether anyone else is offering a similar product/service to their customers, and if so, what that product is and how it's working out for you.

In the past, we've left it entirely up to our individual customers to handle their own backups. However, many of our customers are very small, and they often don't have any IT staff or even any tech-savvy users in the organization, so they may or may not have good backups. But of course, when their server dies or they get hit with a virus, malware, etc., they call us needing help. We're then forced to search for a good backup (which may not exist at all, or is sometimes several months old), meaning they can potentially lose several months' worth of work and financial data for their customers or, at worst, all of their data.

Any input would be appreciated.

Thanks,

Jon
 
It looks a bit like it's done in VFP, but does it really help Jon?

I think the easiest way to get both a strong cloud backbone and something you can brand yourself (or keep to yourself) is to go with one of the major clouds: Microsoft (Azure), Amazon (AWS), or Google. Create your backup SaaS offering there and use it only from within your own desktop software.

Edit: A second-best option would be writing an app for Dropbox, Google Drive, or OneDrive, but you wouldn't want your software collecting all of your customers' data into your own drive, so that would require each customer to have one of these as their own cloud storage solution. As a SaaS vendor, you can make use of cloud file space and databases that are not part of such a drive implementation, and you wouldn't need to implement all the drive capabilities; you only need to let each customer get back to his data via your application as the frontend to whatever you use as storage.

I'd opt for something that lets you keep customers' data separated, so that you could even give each customer access to only their own portion of the backups, and then we're back at core cloud usage.
Azure has Blob Storage, AWS has Glacier, and Google calls it Cloud Storage. AWS and Google call the containers for your files "buckets" (Azure calls them "containers"); you can create as many as you like, put them into a specific region (e.g. for GDPR compliance), and perhaps simply create one per customer rather than only one for your whole application.
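
Creating one bucket per customer can even be scripted from VFP itself. Just as a minimal sketch (assuming AWS S3 as the concrete storage, the AWS CLI installed and configured on the machine, and a made-up customer ID and naming scheme):

Code:
* Sketch only: create one S3 bucket per customer via the AWS CLI.
* Assumes the AWS CLI is on the PATH and credentials are already configured.
lcCustomerId = "acme0001"                      && hypothetical customer identifier
lcBucket     = "myapp-backup-" + LOWER(lcCustomerId)
lcRegion     = "eu-central-1"                  && e.g. an EU region for GDPR compliance
lcCmd        = "aws s3 mb s3://" + lcBucket + " --region " + lcRegion
RUN /N &lcCmd

Azure's and Google's command-line tools offer the equivalent; only the command line changes.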

Of course you could even host the data as a database in the cloud, but it's worth considering keeping the data local and creating the backup at a local endpoint, perhaps as an encrypted ZIP archive that you then upload to cloud storage. So while the core data is a relational database, you'd still only use blob storage services: you don't need table and record access, you just upload and download whole backup archive files. The APIs are as simple as specifying a bucket name and the file to upload or download. Really simple.
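
For example, once a backup archive exists locally (say the encrypted ZIP mentioned above), the upload itself is a one-liner against the CLI. Again only a sketch with placeholder names:

Code:
* Sketch only: push a finished backup archive into the customer's bucket.
lcZipFile = "C:\MyApp\Backup\backup_20240101.zip"     && hypothetical local archive
lcBucket  = "myapp-backup-acme0001"                   && hypothetical per-customer bucket
lcCmd     = [aws s3 cp "] + lcZipFile + [" s3://] + lcBucket + [/]
RUN /N &lcCmd
* Restoring is the same call the other way round: aws s3 cp s3://<bucket>/<file> <local path>

In a real implementation you'd of course wait for the command to finish and check the result before telling the user the backup succeeded.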

Bye, Olaf.
 
Hi Tamar, I e-mailed Rick about CleverFox, but they don't offer re-branding, which is on our list of requirements.

Hi Olaf, what you're suggesting sounds similar to a third-party solution that we've been researching. They offer a backup client that allows re-branding, and it also lets you choose your own storage (Amazon, Azure, Google Cloud, etc.). Their product seems to be tailored more to managed service providers than to software companies, but it may prove to be the way to go for us based on our requirements.

Thanks,

Jon
 
If your likely back-up size is going to be less than 2GB, you could quite easily write something in VFP - anything bigger than that is likely to be a problem!

Regards

Griff
Keep [Smile]ing

There are 10 kinds of people in the world, those who understand binary and those who don't.

I'm trying to cut down on the use of shrieks (exclamation marks), I'm told they are !good for you.
 
Well, the natural file size limit, yes. But that's per file, and a ZIP archive of all the data can be larger, as you can use many things besides Craig Boyd's FLL, e.g. the command-line interface of (64-bit) 7-Zip, to create larger archives.

I'm not even sure the FLL has that limit just because it extends VFP and runs within the 32-bit process. You can use the API to create larger files natively, too; I showed that in thread184-1613571, look at the end of it. The reason for the 2GB DBF limit is 4-byte pointers, and indeed the maximum of 2GB RAM for any 32-bit process limits what can be loaded at a single time anyway.
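
As a bare-bones illustration of the API route (standard Win32 declarations, but treat it as an untested sketch, not the exact code from that thread):

Code:
* Sketch: write a file through the Win32 API, which is not bound to the 2 GB limit
* of VFP's low-level file functions. Error handling kept minimal for brevity.
#DEFINE GENERIC_WRITE         0x40000000
#DEFINE CREATE_ALWAYS         2
#DEFINE FILE_ATTRIBUTE_NORMAL 0x00000080

DECLARE INTEGER CreateFile IN kernel32 AS apiCreateFile ;
   STRING lpFileName, INTEGER dwDesiredAccess, INTEGER dwShareMode, ;
   INTEGER lpSecurityAttributes, INTEGER dwCreationDisposition, ;
   INTEGER dwFlagsAndAttributes, INTEGER hTemplateFile
DECLARE INTEGER WriteFile IN kernel32 ;
   INTEGER hFile, STRING lpBuffer, INTEGER nBytesToWrite, ;
   INTEGER @lpBytesWritten, INTEGER lpOverlapped
DECLARE INTEGER CloseHandle IN kernel32 INTEGER hObject

lnHandle = apiCreateFile("D:\test\bigfile.dat", GENERIC_WRITE, 0, 0, ;
   CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, 0)
IF lnHandle > 0                               && -1 would mean INVALID_HANDLE_VALUE
   lcChunk   = REPLICATE("X", 1024*1024)      && 1 MB of dummy data per write
   lnWritten = 0
   FOR lnI = 1 TO 10                          && only 10 MB here; the same loop keeps working past 2 GB
      = WriteFile(lnHandle, lcChunk, LEN(lcChunk), @lnWritten, 0)
   ENDFOR
   = CloseHandle(lnHandle)
ENDIF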

Bye, Olaf.
 
The FLL is limited to 2GB ZIP files. I'm not sure whether that is VFP or whether it was written before the common format for larger ZIP files (ZIP64) was prevalent, but the limit is real (and very slightly frustrating).
I use Craig's FLL every day; I think it is brill. I wish I could tell how large the ZIP file was going to be without writing it out to the HDD, and that the 2GB limit wasn't there, but it's brill nonetheless.

You are quite right you could use something else to create a zip file.

Regards

Griff
Keep [Smile]ing

There are 10 kinds of people in the world, those who understand binary and those who don't.

I'm trying to cut down on the use of shrieks (exclamation marks), I'm told they are !good for you.
 
OK, good to know.

Using 7-Zip works; 2GB isn't a ZIP file format limit, and the external process can do whatever it needs to create larger ZIPs if necessary.
Any implementation will use zlib, and zlib uses a buffer for compression, something like 8 KB of memory by default; it doesn't keep a whole ZIP file in memory. You want to generate a ZIP file anyway, so there's no reason to keep it all in memory and only write it out at the end. The same goes for extracting: in the end, you want files.

I just tested zipping two ISO files; the resulting ZIP is 4.5 GB and can be processed by 7-Zip again, and even by Windows' native ZIP capability.

You can do (for example):

Code:
RUN /N "C:\Program Files\7-Zip\7z.exe" a D:\test\isofiles.zip D:\test\isofiles\*
Command reference is in the 7zip help.
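
One caveat: RUN /N returns to VFP right away; it doesn't wait for 7-Zip to finish. If the next step (e.g. the upload) depends on the archive being complete, a variant that waits and checks the documented exit code (0 = no error) could look roughly like this sketch:

Code:
* Sketch: run 7-Zip via WScript.Shell so we can wait for it and read the exit code.
lcCmd   = ["C:\Program Files\7-Zip\7z.exe" a D:\test\isofiles.zip D:\test\isofiles\*]
loShell = CREATEOBJECT("WScript.Shell")
lnExit  = loShell.Run(lcCmd, 7, .T.)          && 7 = minimized window, .T. = wait for completion
IF lnExit = 0
   ? "Archive created OK"
ELSE
   ? "7-Zip reported exit code", lnExit
ENDIF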

Bye, Olaf.

 
Just wanting to add my 2 cents. I have been at this software thing for over 25 years, and one of the hardest-learned lessons was in the area of providing backups for customers. I JUST DON'T DO IT. Providing the complete backup solution comes with a non-verbal agreement that I AM RESPONSIBLE FOR THE DATA. Lots can go wrong with backups, and customers do not notice when things go wrong, like a backup that has been failing for the last 6 months. You can justify till you are blue in the face why it is not your issue, and it just doesn't matter: the data is gone and they need to point at someone. I recommend backup solutions and will even set them up at my billable rate, but the backup is their backup, not mine. I learned this at the cost of a very good, long-standing customer, and I ended up rebuilding as best I could from what they had and spent bunches of hours that were, to the customer, un-billable. Lesson learned: there are many ways for a software developer to make money, but for me, providing backups is not one of them.



Steve Bowman
Independent Technology, Inc.
CA, USA
 