
Measuring memory usage in Windows 7

by Brandon on February 21st, 2010

Historically, measuring the amount of memory in use by a Windows system has been a somewhat confusing endeavor.  The labels on various readouts in Task Manager, among other places, were often either poorly named or simply misunderstood.  I’ll tackle a prime example of this, the “commit” indicator, later in this post.  But first, let’s look at a simple way to measure the amount of physical memory in use on your system.

In Windows 7, the folks building the Task Manager performance tab tried to make it a little easier to understand the usage of physical memory on your system.  The most interesting bits are here:


What do these values tell us?

– We are looking at a machine with 4GB of physical memory installed.

– 71% of that physical memory is currently in use by applications and the system.

– That leaves 29% of memory “available”, despite the indication that only 16MB of physical memory is totally “free.”

Here’s a description of the four labels, from the bottom:

Free – This one is quite simple.  This memory has nothing at all in it.  It’s not being used and it contains nothing but 0s.

Available – This number includes all physical memory which is immediately available for use by applications.  It wholly includes the Free number, but also includes most of the Cached number.  Specifically, it includes pages on what is called the “standby list.”  These are pages holding cached data which can be discarded, allowing the page to be zeroed and given to an application to use.

Cached – Here things get a little more confusing.  This number does not include the Free portion of memory.  And yet in the screenshot above you can see that it is larger than the Available area of memory.  That’s because Cached includes cache pages on both the “standby list” and what is called the “modified list.”  Cache pages on the modified list have been altered in memory.  No process has specifically asked for this data to be in memory, it is merely there as a consequence of caching.  Therefore it can be written to disk at any time (not to the page file, but to its original file location) and reused.  However, since this involves I/O, it is not considered to be “Available” memory.

Total – This is the total amount of physical memory available to Windows.

Now, what’s missing from this list?  Perhaps a measurement of “in use” memory.  Task Manager tells you this in the form of a percentage of Total memory, in the lower right-hand corner of the screenshot above.  71%, in this case.  But how would you calculate this number yourself?  The formula is quite simple:

Total – Available = Physical memory in use (including modified cache pages).

If you plug in the values from my screenshot above, you’ll get:

4026MB – 1150MB = 2876MB

This matches up with the 71% calculation.  4026 * .71 = 2858.46MB.

Recall that this number includes the modified cache pages, which themselves may not be relevant if you are trying to calculate the memory “footprint” of all running applications and the OS.  To get that number, the following formula should work:

Total – (Cached + Free) = Physical memory in use (excluding all cache data).

On the example system above, this means:

4026MB – (1184 + 16) = 2826MB

By looking at the difference between these two results, you can see that my laptop currently has 50MB worth of disk cache memory pages on the modified list.

So what is “commit?”

Earlier I said that measuring physical memory usage has been tricky in the past, and that the labels used in Windows haven’t necessarily helped matters.  For example, in Windows Vista’s Task Manager there is a readout called “page file” which shows two numbers (e.g., 400MB / 2000MB).  You might guess that the first number indicates how much page file is in use, and the second number indicates the amount of disk space allocated for use – or perhaps some sort of maximum which could be allocated for that purpose.

You would be wrong.  Even if you disabled page files on each of your drives, you would still see two non-zero numbers there, the latter of which would be the exact size of your installed physical RAM (minus any memory unavailable to the OS because of video cards, 32-bit limitations, etc.).  Unfortunately, the label “page file” didn’t mean what people thought it meant.  To be honest, I’m not quite sure why that label was chosen.  I would have called it something else.

In Windows 7, that label changed to “Commit.”  This is a better name because it doesn’t lend itself as easily to misinterpretation.  However, it’s still not readily apparent to most people what “commit” actually means.  Essentially, it is this:

The total amount of virtual memory which Windows has promised could be backed by either physical memory or the page file.

An important word there is “could.”  Windows establishes a “commit limit” based on your available physical memory and page file size(s).  When a section of virtual memory is marked as “commit,” Windows counts it against that commit limit regardless of whether it’s actually being used.  The idea is that Windows is promising, or “committing,” to provide a place to store data at these addresses.  For example, an application can call VirtualAlloc with MEM_COMMIT for 4MB but only actually write 2MB of data to it.  This will likely result in 2MB of physical memory being used.  The other 2MB will never use any physical memory unless the process reads from or writes to it, but it is still charged against the commit limit, because Windows has guaranteed that the application can write to that space if it wants.  Note that Windows has not promised 4MB of physical memory, however.  So when the process writes there, it may use physical memory or it may use the page file.

This is a great example of why disabling your page file is a bad idea. If you don’t have one, Windows will be forced to back all commits with physical memory, even committed pages which are never used!

Further, processes may be charged against the commit limit for other things.  For example, if you create a view of a file mapping with the FILE_MAP_COPY flag (indicating you want Copy-On-Write behavior for writes to the file view), the entire size of the mapped view will be charged as Commit… even though you haven’t used any physical memory or page file yet.  I wrote a simple scratch program which demonstrates this:

#include <windows.h>
#include <conio.h>
#include <stdio.h>

int wmain(int cArgs, PWSTR rgArgs[])
{
    if (cArgs == 2)
    {
        HANDLE hFile = CreateFile(rgArgs[1], GENERIC_READ | GENERIC_WRITE, 0, nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
        if (hFile != INVALID_HANDLE_VALUE)
        {
            HANDLE hMapping = CreateFileMapping(hFile, nullptr, PAGE_READWRITE, 0, 0, nullptr);
            if (hMapping != nullptr)
            {
                // The whole copy-on-write view is charged as commit up front.
                void *pMapping = MapViewOfFile(hMapping, FILE_MAP_COPY, 0, 0, 0);
                if (pMapping != nullptr)
                {
                    wprintf(L"File mapped successfully.  Press any key to exit and unmap.");
                    _getwch();
                }
            }
        }
    }
    return 0;
}

Before running this program, let’s take a look at Task Manager again.


Now, if I run this scratch program and pass it the path to my Visual Studio 2010 Beta 2 ISO image (a 2.3GB file), the Task Manager readout changes to:


Notice how my physical memory usage is unchanged, despite the fact that Commit has now increased by the full 2.3GB of that file.

In fact, my commit value is now 6GB, even though I have only 4GB of physical memory and less than 3GB in use.

Note: It is not common for applications to commit enormous file mappings in this way.  This is merely a demonstration of Commit and Used Physical Memory being distinctly different values.


  1. Phil Sweet permalink

    Excellent. Thanks for taking the time to write this up.

  2. Great article. It’s always refreshing to read about this topic from somebody who knows what he is talking about. Sadly, this is rare these days.

  3. Thank you Brandon, I scoured the web for hours until I found your article above. Authoritative and answered all my questions.

    Best wishes,
    Derek Williams.

  4. avinash. permalink

    An excellent article.

  5. Judi permalink

    Great article. Microsoft’s explanation is HORRIBLE. I actually understand it now.

    Thanks so much.

  6. B& C Cabinets permalink

    Would this be why our AutoCad crash with out of memory error?

  7. Despot permalink

    you are of great help, thank you for taking the time to write such a great and self explanatory article.

  8. Istvan permalink

    THANK YOU FOR THIS! Exceptionaly good and usefull lecture! Like a good, pro teacher!

  9. TomK permalink

    I don’t think your explanation is completely correct. Windows 7 64 bit knows I have 12G installed, but in the Physical Memory (MB) area I see:

    Total 6135
    Cached 3679
    Available 3607
    Free 0

    Yet if you go into the resource monitor, it gives me roughly the same numbers, they also have a field “Installed” and it shows 12288 MB of memory.

    So the Total appears to be ???? don’t know.

    An interesting quirk in resource monitor it shows “hardware reserved” to be 6153

    So I’m still puzzled.

  10. @TomK –

    This is probably because your motherboard supports a maximum of 8GB of addressable memory. 64-bit chipsets never provide the full 64-bit addressable range (no CPU or OS even supports it right now). Some max out at 8GB, some at 12, some much higher. This means that if you put that much (or more) RAM into the system, you won’t be able to use it all. The best you get is (Max Addressable) – (Reserved by devices) = (Available to Windows).

  11. Goodtime permalink

    Big thanks for your article!

  12. Eli permalink

    Thanks Allot,

    what the recommendation for managing the Page file with the memory Managment?

  13. Alex permalink

    This is one of the few times I’ve read an article online and was moved to thank the author. It was simple, easy to understand, and dispelled a lot of my misconceptions. Thanks so much.

  14. Niong permalink

    thnx for writing and explaining it. great post

  15. MLO permalink

    You gave an example of why disabling the pagefile was a bad idea. At some point I don’t think that holds true. Say if you have Windows 7 x64 with 12 gigs of physical memory. Under those circumstances I just don’t see why a page file is needed or wanted. Perhaps you will comment on this.

  16. @MLO – Even if you have 12GB of RAM, why would you want to waste some of it? If you disable the page file, that will happen (because all of your commit limit will be charged against RAM even when it’s never actually used).

    The only benefit of disabling the page file is the disk space you get back. But in most circumstances that benefit is not particularly important.

  17. Hi Brandon, Thanks for the info. Can you also tell about the “commit limit” please.

    I’m monitoring the windows 7 system performance using “perfmon” and I want to know the threshold for “Memory\committed bytes in use” counter.

    After reading few articles on net I have taken the threshold = to system’s virtual memory (virtual memory details are taken from system information window). But the values shown in task mgr and the system info window are different. System info window is showing 2.5GB and task manager is showing 4094 MB i.e 4 GB. So which one to be considered as the threshold. I’m bit baffled because of these 2 different values. Can you please help me. Thanks in advance.

  18. Carlos permalink

    Hi Brandon, Thanks for the article, great explanation, however I agree with MLO, If I have more than enought memory I think that disabling the page file its a good idea, that is the only way (I think) that you can guarantee that nothing goes to the disk gaining in performance, the whle idea of the page file is to “simulate” more memory but what happend if you are totally sure that you can backup everything with physical memory, another thing to be considered is that the page file can be a security risk, I use a lot of encrypted info and I don´t want to have the risk some of this info ends in the page file, I know to can cypher that too but that imposes another perfomance cost, let me know please what do you think, bye

  19. Rick Oshea permalink


    I spent my entire last check upgrading the RAM on my Windows 7 PC. I had 4GB and now I have 8GB. More than anything, this has changed my attitude–now I know more than any of the hardware/software engineers who preceded me. I have a big stick by my keyboard and I wave it menacingly at my PC, yelling threatening things such as “I have 8GB and now I make the rules!” and “I don’t need no stinkin’ Page File!”. I believe it, too, because it comes from my own lips. Now I’m in charge. It’s a good thing I’m so powerful too, as something has to make up for my lack of knowledge.

    I have no girlfriends, but I have lot of RAM. Surely that makes up for it.

    Your silicon pal,
    Rick O’ Shea

  20. Carlos permalink

    Hahaha you must change your memory provider or change your job, because 8 GB only cost US40

  21. Vinoth permalink

    Awesome!!! Very Clear and detailed.

  22. That was really helpful as I got a new laptop on Friday for photo editing and it is my first win7 machine and I thought for some reason I didn’t have as much memory as I ordered, however this has really helped me understand what is going on with it all.

    Thanks :)

  23. Reg permalink

    As others have mentioned:
    in the internet many THINK they know, but few really know…

  24. Thuh – and I might add – anks. I was going nuts without this info.

    Thanks again.

  25. Don T. permalink

    I was led to believe that “Free” under Physical Memory to be a Percentage, not MBs. So in the case above it would be telling you that 16% of the Physical Memory fits the definition of “Free.” If you recalculate some of your numbers using this % concept does that make more sense in the results? Or is someone all wet in the assertion that “Free” is a percentage? (I’ve never seen it over 100, myself. So that made sense to me.)

  26. Saurabh Chokhra permalink

    Please help me to calculate pecentage of file loaded in application?
    Please i am searching it from last 4 days but couldnot find a solution.

  27. Another comment on page files that I haven’t seen addressed anywhere, even with all the testing I’ve read about. I agree that the performance testing difference between not having a page file and having a page file is nada so you should never not have a page file. The problem with all this analysis is that it always compares a page file on a brand new machine running very few applications. In my case I regularly have 400 or more windows open at a time including many multiple Word docs, lots of Excel spreadsheets, Chrome, Firefox, Opera, Adobe Reader, Foxit PDF Reader, etc. (I use multiple products in case one crashes the it only affects a subset of my websites/documents, etc.). Right now, I’m doing this and 6 gig of memory is used with 6 gig available. With so many windows open I can only reboot about once a month at patch time and then it can take me hours to shut down and start up again. I hibernate the machine every night.

    It felt that when I used to run my Win 7-64, 12 gig machine with a page file it was slower, probably because my page file was so fragmented, affecting performance, even though I had lots of available memory. Now, with no page file, everything is written to memory and I don’t care about fragmentation. If I ever get to the point where I begin to run out of memory I’ll take the machine to 16 gig.


  28. igor permalink

    Thanks for the article! Very good and clear explanation.

  29. nice one!

  30. Richard permalink

    64-bit is the way to go with RAM and OS/App appetites these days. With x86 on Windows 7, physical RAM is limited to 4GB — actually closer to 3GB considering what’s taken by video. Plus, memory is twice as wide with 64-bit, increasing throughput to load the same amount of data as 32-bit mem.

  31. nand permalink

    great article definitedly answered my questions

  32. sourabh permalink

    A superb explanation

  33. Tim permalink

    Nice! Thanks for taking the time to write it up.

    I’m trying to figured the same thing from a PerfMon trace. The Available bytes is reported, but it’s not clear to me which values add up to Cached. And PerfMon does not report the Total (Configured) value nor the percent in use.

    Any insights?

  34. Tim permalink

    I think I’ve found the Available, Cached, and Free values in the Perfmon counters:

    \\TIM\Memory\Available Bytes
    \\TIM\Memory\Cache Bytes
    \\TIM\Memory\Free & Zero Page List Bytes

    Although, if I do the two different calculations for Memory In Use Brandon outlined above, I get opposite results:

    Total – (Cached + Free) < Total – Available

    rather than the converse.

  35. poulami pal permalink

    great article sir!! very helpful…

  36. Steve permalink

    First time I’ve read an article on this topic that actually makes sense. Even had a discussion with the hosting provider of our company web site when were were running low on memory and they couldn’t explain what the number meant. They said just send us more money for more memory and it will be fixed.

  37. crokusek permalink

    > This is a great example of why disabling your page file is a bad idea. If
    > you don’t have one, Windows will be forced to back all commits with
    > physical memory, even committed pages which are never used!

    I don’t understand completely the concept of commit so maybe more explanation is needed. But on your comment quoted above–who cares whether commits are backed with physical memory? That seems like a good thing when running without a page–guaranteeing no use of hard drive as a cache.

    Also, as windows realizes it is near the end of exhausting physical memory, I would expect the algorithms to get less greedy about anything that is not truly needed such as your claim about “committed pages” that are never used. An article demonstrating any inefficiencies with pageless mode would really help. It sounds like you are saying some % of physical memory is moot while running in pageless mode which I find hard to believe.

  38. smbrooke permalink

    Many thanks, Brandon. Great exposition and certainly clarified matters for me.

  39. Col Sanders permalink

    Brandon – like the others have said … thanks for a great article.

    I’m currently trying to understand why a 3MB Excel 2010 spread sheet comes up with “Excel cannot complete this task with available resources” error since upgrading from WinXP to Win7 64bit.
    The failure occurs when the TM counters show:
    Physical Memory: 76%
    Commit = 3544 / 7956
    Armed with your excellent explanation of the various counters (in conjunction with other Perfmon counters) hopefully I can determine what the heck is happening.

  40. jaranF permalink

    An absolutely fantastic article. Thanks. I too had been searching on the web for a while and nothing came close to this. I like your articles exhaustive coverage of the subject (going into the history as well) and its ability to anticipate my next questions

  41. Bob Vanderbloemen permalink

    Great post, best explanation I’ve ever seen for all these categories.

    Thanks for takin the time!

  42. What a shame all this is bullshit! No pagefile is bad, nope don’t think so. HDD is slower than RAM, so why use HDD if RAM is there and in mass amounts? No point… And free memory does not matter, you sir are speaking bull shit! So when I have 1500MB available that means I do not need more ram to do more tasks? No thats a lie too, I struggle to do basic chrome browsing but when I popped in 4GB more, I now have LOTS of FREE RAM and everything runs sweet. So your whole story is lies and must not be taken with any credibilty by anyone. If you are low on RAM, get an upgrade like I did, if you want a faster PC, get an upgrade. It is so simple but people never learn. The slowness can also be caused by malware or other unwanted software also, best to look into that before spending money on anything.

  43. Steven – Perhaps you should read the entire post before replying next time. This would save me further /facepalms.

    Everything in the post is 100% accurate. As a former senior developer on the Windows team I’m well versed in the subject.

    I think you confused the notion that “free RAM is wasted RAM” with some idea that “more RAM isn’t useful.” Obviously, you are confused as I never once suggested the latter. If you didn’t have enough to do your tasks without paging, then obviously you did not have any free RAM, otherwise it wouldn’t have been free! So of course upgrading it helps you.

    If you used to be exceeding your RAM availability by 1-2GB, and then you added 4GB, you shouldn’t be surprised when you have some that’s free! Of course, as I described in the post, Windows will try to make use of most of that for the disk cache and superfetch (because free RAM is wasted RAM!). But eventually you’ll reach a point where it just doesn’t have anything useful to put there sometimes, or where it reaches the maximum amount of I/O it wants to do to preemptively populate things.

    No page file is bad. I explained in great detail the reasons why. You haven’t disputed any of them. So go read the post, and perhaps learn something!

  44. Jack permalink

    Brandon – Great article!

    Virtual memory is an integral part of a modern computer architecture, and no one could deny its benefit.

  45. Todd permalink

    When I left my pagefile on auto, Win7 would allocate 16 gb to my SSD for viritual memory space, and then another 8 gb+ for hibernate mode. That is half of my drive! I hate that Microsoft prefetches/caches everything. I cannot open internet explorer after my “free” memory is under 300 mb from a possible 16 Gb! I just disabled superfetch and will restart the PC. It is very troublesome to watch my available ram go to 0 and NOT release it, ever.

  46. Jack permalink

    Great stuff…. Now I know why my computer is so slow……

  47. Lon permalink

    Brandon – This was an incredibly informative article. The entire idea of understanding how memory is used is a challenge. I wonder – might this entire discussion of how Windows 7 is used be essentially the same in Windows 10 or have there been fundamental changes to memory management with this overhaul?

  48. powershellur permalink

    “Now, what’s missing from this list? Perhaps, a measurement of “in use” memory. ”

    i dont accept this, as it is shown clearly in the form of a graph just above these things

