I've seen many SQL Server configurations that seem optimal at first glance, yet the disk is working constantly and performance is poor. Those servers could have been a lot better if one thing had been set:
SQL Server's Maximum Server Memory.
I've read somewhere that it can be calculated, and some think it should be set to the server's total physical memory, but in my opinion and in my experience no real effort is needed: do a fresh restart, wait for everything to load, look at the Available memory in Task Manager, round it down, and set that. No more paging.
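If you'd rather read that number from inside SQL Server than from Task Manager, there's a DMV that exposes it; a minimal sketch, assuming SQL Server 2008 or later:

    -- physical memory totals as SQL Server sees them, in MB
    SELECT total_physical_memory_kb / 1024 AS total_mb,
           available_physical_memory_kb / 1024 AS available_mb
    FROM sys.dm_os_sys_memory;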
You should take that advice with a grain of salt if your server is used for purposes other than SQL Server, as other services might take more memory after a while.
For example, if you have IIS on the same server, check how many application pools you have, check how much memory they take after 24 hours, and subtract that from your available memory.
You set this value in the server properties in Microsoft SQL Server Management Studio.
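If you'd rather script it than click through the GUI, the same setting is exposed through sp_configure; a minimal sketch, where 14336 MB (14 GB) is just a placeholder you'd swap for your own rounded-down number:

    -- 'max server memory (MB)' is an advanced option, so expose it first
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    -- set the cap in MB; 14336 here is only an example value
    EXEC sp_configure 'max server memory (MB)', 14336;
    RECONFIGURE;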
If you want to know whether your server is under- or over-using memory, you can estimate the required amount by checking how big the table data and indexes are. If that total is lower than the physical memory you've allocated to SQL Server, you should be good: it means table and index access can be cached in memory.
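One rough way to get that estimate, as a sketch: every page in SQL Server is 8 KB, so summing the reserved pages for all tables and indexes gives a database's size in MB (run it per database, SQL Server 2005 or later):

    -- total reserved space for all tables and indexes in the current database
    SELECT SUM(reserved_page_count) * 8 / 1024 AS reserved_mb
    FROM sys.dm_db_partition_stats;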
The more memory you have, the more SQL Server can cache data, indexes, and execution plans.
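You can also see how that cache is actually being used by counting the pages each database holds in the buffer pool; a sketch using sys.dm_os_buffer_descriptors:

    -- how much of the buffer pool each database occupies, in MB
    SELECT DB_NAME(database_id) AS database_name,
           COUNT(*) * 8 / 1024 AS cached_mb
    FROM sys.dm_os_buffer_descriptors
    GROUP BY database_id
    ORDER BY cached_mb DESC;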
Friday, February 12, 2010