So… I’ve not been impressed with Sage Accounts 2015 so far. First, they blocked installation of the software on Small Business Server 2011 and I had to call to find out how to get it to actually install.
Secondly, I was told that you can no longer use a mapped drive as the location for your Sage files because it now causes problems with the backup method. I had to go around each computer and replace the z:\COMPANY.XXX path with \\fileserver\sageaccounts\COMPANY.XXX!
Then… I find that the server has massively slowed down and at least one of the client computers is dog slow too.
In the case of the server, it started a scheduled backup using the Windows Temp folder as a scratch area, never completed it (abandoned it?), and then did the same the next day. After a while, this filled up the server hard disk.
Sage scheduled backups (a new feature in this version of Sage, as far as I know) seem to use a “c:\windows\temp\YYYY-MM-DD HH-MM-SS” folder structure to stage the backup. When working correctly, the service copies the Sage company files into the temp folder, compresses them into c:\SageBackups and then deletes the Windows Temp folder. (The server wasn’t managing to complete the backup, so the temp files just accumulated.)
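When diagnosing this, it helps to see how much space the abandoned staging folders are actually consuming. Here is a minimal diagnostic sketch (Python; the dated-folder naming pattern is taken from the behaviour described above, and the temp path is whatever your server uses):

```python
import re
from pathlib import Path

# Folders matching Sage's staging pattern, e.g. "2015-03-02 01-00-00".
STAGING_RE = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}-\d{2}-\d{2}$")

def abandoned_staging_size(temp_dir):
    """Return (folder_count, total_bytes) for leftover staging folders."""
    count, total = 0, 0
    for entry in Path(temp_dir).iterdir():
        if entry.is_dir() and STAGING_RE.match(entry.name):
            count += 1
            total += sum(f.stat().st_size
                         for f in entry.rglob("*") if f.is_file())
    return count, total
```

Point it at c:\windows\temp on the affected server; a large count with a large total is the symptom described here.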
I’ve solved this one simply by turning the inbuilt Sage backup off; the server already has a robust and regular “versioned” backup onto external hard disks.
The problem I can’t fathom so far is on one of the client computers on the network. The symptoms are constant hard disk activity and the machine being awfully slow. There is no sign in Task Manager that the cause might be Sage; it isn’t using a huge amount of RAM or CPU.
The only way I found out was using Process Monitor from Sysinternals / Microsoft.
Process Monitor shows sg50svc.exe copying random crap, such as Google Earth and some Chrome files from the user’s AppData folder, into the “c:\windows\temp\YYYY-MM-DD HH-MM-SS” folder! What?! Anyway, end the process and the machine speeds up and everything is normal again, until the next day when it starts doing the same thing again.
I checked that the scheduled backup option was off and, again, it is.
Sage have been contacted with a video showing this madness… watch this space. It seems their new scheduled backup option and code need some work!
Did you get any response from Sage?
We are also having the same issue. I am thinking the easiest resolution is to turn off the new scheduled backups feature centrally via Group Policy (a registry setting or similar), but I cannot find the required settings.
Rang Sage about this issue this morning; it’s a well-known problem. For anyone else experiencing it: open any company data, go to File, then Scheduled Backups, and change it to ‘Back up and check data off’. They don’t have an off option, merely an option to remove automated schedules! Please be aware that if you run this in a Citrix or TS environment it’s unsupported and may continue to back up automatically regardless. Sage have no fix for this as it’s unsupported; they merely confirm it as an issue…
P.S. If you do this for one data set, I’ve been assured it does them all for that machine, but you will need to do this per machine wherever Sage is installed.
We had the same issue with the Sage Accounts scheduled backups.
It’s the same in V22 (2016).
I work for a company that hosts remote desktops for clients. We have Sage Accounts installed on Terminal Servers and the data service on a file server.
To disable the scheduled backups, we put a Group Policy in place to delete the sg50Svc_v22.sqlite-shm and -wal files from the Windows service folder on the file server at computer startup.
i.e. C:\Program Files (x86)\Sage\AccountsServiceV22 for Sage Accounts 2016.
The Sage Accounts service needs to be stopped before the files can be deleted.
This isn’t an official fix but it seems to work for us.
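For anyone who wants to script the same cleanup rather than use Group Policy, here is a rough sketch (Python; the folder path is the V22 one quoted in the comment above — adjust for your version, and remember the Sage Accounts service must be stopped first, which this sketch does not enforce):

```python
import glob
import os

# Service folder for Sage Accounts 2016 (V22); adjust the suffix for other versions.
SERVICE_DIR = r"C:\Program Files (x86)\Sage\AccountsServiceV22"

def delete_schedule_files(service_dir=SERVICE_DIR):
    """Delete the sqlite -shm/-wal files that hold the scheduled-backup state.

    The Sage Accounts service must already be stopped, or Windows will keep
    the files locked and os.remove() will fail."""
    removed = []
    for pattern in ("*.sqlite-shm", "*.sqlite-wal"):
        for path in glob.glob(os.path.join(service_dir, pattern)):
            os.remove(path)
            removed.append(path)
    return removed
```

Run this on whichever machine actually holds the data (the file server in most setups here), same as the Group Policy approach.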
Thank you for the information! I’m amazed they don’t see this as quite an urgent problem. I attribute at least one early-life SSD failure at one of my customers to the insane amount of disk IO that the bug creates.
Hi David, does that fix also work for 2015? I’m assuming I just subtract the _22 part. We’re finding it still runs in Citrix/TS environments, as we feared (god, Sage sucks).
Yeah the fix also works for 2015.
Do your users have a set company list which doesn’t change? eg they have 3 sets of company data and these rarely change.
The fix needs to be run on the device where the data is stored.
If your data is held on the terminal server, run it on the terminal server; if the data is on a file server, it will need to be applied there.
It’s best to redirect all data to the file server and apply the fix there.
Remember this isn’t an official fix, but it works for our needs.
Hi David, it’s a Citrix environment with 5 XAs and all data is stored centrally on the file server (the odd time an accountant brain farts and puts it on the C: drive). Does this policy need to run every time (does the file regenerate?) or can you simply navigate to the folder and delete it?
The file does regenerate.
There is a better way to configure Sage Accounts in the first place.
On the file server, create blank folders based on how many licences you have.
ie 50 licences = create COMPANY.001-050
In each folder place a blank subfolder called ACCDATA.
Populate the COMPANY file with these locations and set it to read only.
Ask your users not to add/remove any companies from Sage, but use the existing list.
A blank company location is named *** NO DATA FOUND ***
Double-click one to open it, ready to create a new company or restore data into.
The file path is then already forced on the user.
If all slots are filled then all licences are used. Ask the user to overwrite an existing company.
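The slot-creation steps above can be scripted. A sketch, assuming nothing about the COMPANY file beyond what the comment says (it lists the slot locations; the exact file format is Sage’s, so the function just returns the paths for you to paste in and then mark read only):

```python
from pathlib import Path

def create_company_slots(root, licences=50):
    """Pre-create COMPANY.001..NNN folders, each with an empty ACCDATA
    subfolder, and return the slot paths for the COMPANY file."""
    paths = []
    for n in range(1, licences + 1):
        slot = Path(root) / f"COMPANY.{n:03d}"
        # parents/exist_ok make the script safe to re-run.
        (slot / "ACCDATA").mkdir(parents=True, exist_ok=True)
        paths.append(str(slot))
    return paths
```

For example, `create_company_slots(r"\\fileserver\sageaccounts", 50)` would create COMPANY.001 through COMPANY.050 on the share described earlier in the post.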
Many thanks for finding and posting this. It’s been giving us a headache for a few days.
Sage really need to get this sorted.
Thank you for the thanks… glad it helped. It took me ages to work out, and Sage didn’t seem interested when I reported the problem to them (even with video, screenshots, etc. of the thing copying files unrelated to Sage).
I have a file server where all the Sage data is stored, and a bunch of client PCs with Sage installed on them that access the data on the Sage file server. I am seeing a LOT of files being read and written to and from c:\windows\temp\ on the FILE SERVER.
Since the clients don’t have access to the file server’s c:\windows\temp\ folder, what on the file server is generating all these temp files and how can I stop it?
I see people saying that you can disable the scheduled backup in the Sage software, but won’t that just stop these backups from running on their local computers and have no effect on what’s going on on the file server?
I believe that more recent versions of Sage run a Sage Data service on the server, so your server will probably be running a Sage service. If you open Sage on the server and check the backup settings, it should sort the problem.
BUT.. I’m not sure if changing the backup settings on the server affects the client computers.
Thanks for the thought! Indeed Sage does run some data services on the file server for some of the latest versions; however, only the data services are installed on the server, not the actual client software, so I don’t know how to change the backup settings on the server. Do you know of any INI file edit or any other way to do this? (Maybe the data service picks up the backup settings that each client stores on the file server and uses that flag to initiate the backup process on the server. Hmm, I will have to test!)
The Sage Accounts backups are configured from the client.
If you open one of the client data sets held on that server, you will see backup options for all the data sets stored on that same server that are in your company list on that client computer.
We normally just turn off the scheduled backups and use our own backup tool or manual backups.
In V21/2015 there are limited options; in V22/2016 you can specify the server drive letter for the backups, but it still puts the folder in the root of the drive.
I will see if I can find time at some point to run Process Monitor against Sage to see what it does when you change the backup schedule settings.
David Casey, so if the backup options are set on the client but the data is stored on the file server, then the file server is the device that runs the backups? I am seeing a lot of Sage-related reading/writing in the FILE SERVER’s c:\windows\temp\ folder, which doesn’t make sense to me (I would have thought the backup process would run on each client’s PC, locally to its own c:\windows\temp\ folder, not the file server’s).
thecomputerperson, any help gratefully received!
The configuration is done from the client, but the service runs the backups on the file server.
All settings are stored in an SQLite file or similar.
For Sage Accounts 2016/V22 these are the sg50Svc_v22.sqlite-shm and -wal files,
in C:\Program Files (x86)\Sage\AccountsServiceV22 on the file server.
It uses the Windows temp location to copy the client files first and then backs them up.
We had a similar issue due to an overnight backup running and locking these files.
Yeah, this sucks. We have automatic backups turned off in the client, and just the data service installed on the file server, but it’s continuing to make backups to c:\windows\temp and slow everything down every morning. Deleting the SHM and WAL files doesn’t help – they just get recreated.
How do we stop it permanently?
May be a bit of a drastic measure but might work…
Create a new administrator user.
Set the permissions on the Windows Temp folder to deny for that specific user.
Set the Sage service to run as that user.
Restart the service – hope that it still works but doesn’t have the file access permissions to be able to attempt to write the backups :D
I’ve not tested this – it is just an idea. Don’t blame me if the world explodes.
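For reference, the idea above might translate into commands like the following. This is a hypothetical sketch only: the account name and service name are assumptions (check the real service name in services.msc), and the function just builds the command lines for review rather than running anything:

```python
def build_lockdown_commands(user="SageTempDeny",
                            service="Sage 50 Accounts Service v22"):
    """Build the Windows command lines (as argument lists) that would deny a
    dedicated account write access to the Temp staging folder and set the
    Sage service to run as that account.

    Both names here are guesses for illustration; 'sc config' for a
    non-system account also needs a password= argument, omitted here."""
    return [
        # Deny write (W) on the folder and everything below it: (OI)(CI).
        ["icacls", r"C:\Windows\Temp", "/deny", f"{user}:(OI)(CI)W"],
        # Run the Sage service under that restricted account.
        ["sc", "config", service, "obj=", f".\\{user}"],
    ]
```

Reviewing the generated lines before running them (and restarting the service afterwards) keeps this as reversible as the untested suggestion it sketches.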
Thanks for this – confirmed our suspicions. It seems Sage indiscriminately copies the ENTIRE company directory to the Windows Temp directory. For us, this was 30GB+ because we had stored our (manual) BACKUPS directory inside the company folder. We have now moved this outside, leaving us with 650MB which I suspect will be more manageable.
Copying the entire 650MB folder in order to create an 8MB backup file seems like massive overkill, but there we go…
Back from the dead, but we discovered this issue on one of our legacy terminal servers yesterday. The customer is running Sage 2017 and it was generating an average of 1700 IOPS on the back-end storage. Changing the permissions on Sg50Svc_v24.sqlite3 to deny System seems to have resolved the issue.