
Registered · 10 Posts · Discussion Starter · #1
Over the last two weeks I have been seeing increasingly frequent Out of Memory errors while using Google Chrome.
My impression is that the issue occurs when I visit webpages that use WebGL, mostly (but not only) Google Maps.

My computer's data:
CPU: Intel(R) Core(TM) i5-9400F (2.90GHz)
RAM: 8 GB
GPU: NVIDIA GeForce GTX 1650 (4 GB GDDR5), with fully updated drivers
OS: Windows 11 Pro 21H2 64-bit, fully updated
The OS is installed on an SSD; there is not much free space left on it (about 16 GB), and virtual memory was initially set to a maximum of 2 GB on another disk.
I have never had any memory issues until now, though.

So far I have tried:
1) Rebooting (the issue doesn't occur immediately, but after a while)
2) Deleting cookies, then resetting and reinstalling Chrome (the issue doesn't occur at first, but eventually returns)
3) Checking for malware (I ran MBAM - no malware detected)
4) Increasing virtual memory from 2 GB to 8 GB (the problem persists, though less frequently).

I also think the problem may affect GPU performance: I have noticed, on the chrome://gpu/ page, that WebGL and WebGL2 acceleration had somehow been automatically disabled.

Since Chrome reports it as an out-of-memory issue, what can I do to log Chrome's memory usage and track down what causes the issue?
Any other suggestions?
 

Trusted Advisor · 25,951 Posts
I suspect the cause is the paging file being on another drive rather than on the OS drive.
The OS is installed on an SSD; there is not much free space left on it (about 16 GB), and virtual memory was initially set to a maximum of 2 GB on another disk.
What is the total capacity of the SSD, please?


This is one of the more comprehensive guides on how to solve the error:
Here are the ways to fix 'Google Chrome Is Out of Memory' error — Auslogics Blog

Of course, do not use any of the repair tools advertised on that page.
 

Trusted Advisor · 908 Posts
The page file on another disk "should" not be causing this problem. In fact, having the PF on a secondary drive is a common practice. And it works great assuming there is plenty of free space on the other disk. So how much free space is on that secondary drive?

Unless you are a true expert in memory management and fully understand commit rates and how to calculate the correct PF size, it is best to just let Windows manage the PF size. Contrary to what some folks want us to believe, the teams of PhDs, computer scientists and developers at Microsoft really are true memory management professionals, with decades of experience, exabytes of empirical data, and supercomputers to run scenarios on - ensuring Windows knows how to manage system and virtual memory optimally.
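For reference, you can read the current commit limit (physical RAM plus page file) and how much commit space is still available straight from Windows. A minimal Python sketch using the Win32 GlobalMemoryStatusEx call (Windows only, standard library only):

Code:
import ctypes
from ctypes import wintypes

# MEMORYSTATUSEX structure as defined by the Win32 API (sysinfoapi.h).
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),  # commit limit: RAM + page file
        ("ullAvailPageFile", ctypes.c_ulonglong),  # commit space still available
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

gb = 1024**3
print(f"Commit limit:     {status.ullTotalPageFile / gb:.1f} GB")
print(f"Commit available: {status.ullAvailPageFile / gb:.1f} GB")
print(f"Physical RAM:     {status.ullTotalPhys / gb:.1f} GB")

If "Commit available" approaches zero right before Chrome throws the error, that is a strong hint the commit limit (RAM + PF) is the bottleneck.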

One example of how Windows uses the PF optimally (if users don't dink with the defaults) is the true memory management experts discovered the PF size requirements frequently change as the user changes tasks, or as hardware demands change. So by default, the PF size is "dynamic" and not a fixed size. It will change as necessary because those professionals learned the PF size is NOT a "set and forget" setting.

For sure, you can have multiple page files on multiple disks. And this is common practice when the boot disk is small and there is lots of free space on secondary drives. However, ideally, you want the PF on the fastest drive. Once again, Microsoft and Windows know how to deal with that too and Windows will detect and use the PF on the fastest drive for the higher priority data - assuming there is the necessary free space available to do so.

Windows (all operating systems, actually) needs a big chunk of free space to operate optimally in. This is not just for the PF. It is also for temporary files. And not just cookies and other temporary Internet files either. EVERY TIME the OS opens a system file, and EVERY TIME an application opens an application or data file, a temporary copy is written to the disk too. This is to help ensure no data loss or file corruption should some catastrophic event occur (system freeze, power outage, etc.). When the OS or app is done with that file, the modified version is saved, and the original/temp file is deleted.

Without sufficient free disk space - you get an "Out of memory" error.
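If you want to check quickly how much free space each drive has, here is a minimal Python sketch (assuming Python is installed; the drive letters are just examples):

Code:
import shutil

# Drive letters here are examples - adjust to match your system.
for drive in ("C:\\", "D:\\"):
    try:
        usage = shutil.disk_usage(drive)
    except OSError:
        continue  # drive letter not present
    print(f"{drive} {usage.free / 1024**3:.1f} GB free of {usage.total / 1024**3:.1f} GB")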

I recommend you first change all your PF settings back to Windows managed. Then go through your boot disk and uninstall any programs you downloaded and installed but don't use. Other programs you downloaded and installed but want to keep should be moved or [better yet] uninstalled and re-installed on your larger secondary drives.

Consider moving your default Documents folder to a secondary drive. To move your Documents folder to another drive, D drive in this example, do the following (step 5 can also be scripted - see the sketch after the list):

1. Create a new Documents folder on the D drive,
2. Right-click on the new folder and click Include in Library > Documents,
3. Click Start > Documents,
4. Double-click Documents to open and reveal its contents,
5. Drag and drop (or cut and paste) to move the files to the new folder,
6. Right-click in a blank area then click on Refresh, or press F5 to refresh the view,
7. Under Documents Library, click locations,
8. Right-click the new folder and click Set as default save location.

9. [Optional] Click the old Documents folder and click Remove, or at least delete the files in the folder.
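For step 5, if there are a lot of files, a minimal Python sketch can do the moving (the paths are examples - adjust to your user name and target drive):

Code:
import shutil
from pathlib import Path

# Example paths - adjust to your user name and target drive.
src = Path(r"C:\Users\YourName\Documents")
dst = Path(r"D:\Documents")
dst.mkdir(parents=True, exist_ok=True)

for item in src.iterdir():
    print(f"moving {item.name}")
    shutil.move(str(item), str(dst / item.name))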

I recommend you follow the same procedure for your default Downloads folder too.

Note a big space hogging culprit is often the windows.old folder. This HUGE folder is created during major Windows upgrades and can be used in the rare event the upgrade fails, to "roll back" to the previous version. This folder is "supposed" to be deleted automatically after 30 days - but sometimes, for whatever reason, it does not get deleted. So use File Explorer and look for any instances of windows.old. If the timestamp indicates it is older than 30 days, you can safely delete it. If it is less than 30 days, that means a large update was recently installed. "IF" you feel your computer is working fine, and you feel there will be no need to roll back, you can still safely delete that folder before the 30 day point.
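A quick way to check whether windows.old is present, how old it is, and roughly how much space it uses - a minimal Python sketch (assumes Windows is on C:):

Code:
import os
import time

# Assumes Windows is installed on C: - adjust if not.
old = r"C:\Windows.old"
if os.path.isdir(old):
    age_days = (time.time() - os.path.getmtime(old)) / 86400
    size = 0
    for root, dirs, files in os.walk(old):  # os.walk skips unreadable folders
        for name in files:
            try:
                size += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # skip files we cannot read
    print(f"windows.old is {age_days:.0f} days old, about {size / 1024**3:.1f} GB")
else:
    print("no windows.old folder found")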

deleting cookies, then resetting and reinstalling Chrome (the issue doesn't occur at first, but eventually returns)
This is a major "red flag" for me. Chrome has a long, notorious history of hogging resources by opening up (and leaving open) potentially dozens and dozens of processes. In fact, it is because of such a problem, and such a complaint area, that Google created a special Google Chrome Task Manager just so users can see how many resources it is hogging, and easily kill processes off individually (if they don't want to simply exit and restart Chrome). Start the Google Chrome Task Manager by right-clicking blank space in the Chrome title bar, then selecting Task manager.
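And to answer the logging question from the first post: here is a minimal Python sketch (Windows, standard library only; the log file name is just an example) that samples the combined memory use of all chrome.exe processes once a minute and appends it to a CSV file, so you can see what the total looks like right before the error appears:

Code:
import csv
import subprocess
import time
from datetime import datetime

LOGFILE = "chrome_memory_log.csv"  # example file name

def chrome_memory_mb():
    """Sum the Mem Usage of every chrome.exe process, in MB."""
    out = subprocess.run(
        ["tasklist", "/FI", "IMAGENAME eq chrome.exe", "/FO", "CSV", "/NH"],
        capture_output=True, text=True,
    ).stdout
    total_kb = 0
    for row in csv.reader(out.splitlines()):
        if len(row) >= 5 and row[0].lower() == "chrome.exe":
            # The Mem Usage column looks like "123,456 K" (separator is locale-dependent)
            total_kb += int(row[4].replace(",", "").replace(".", "").split()[0])
    return total_kb / 1024

with open(LOGFILE, "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        writer.writerow([datetime.now().isoformat(), f"{chrome_memory_mb():.0f}"])
        f.flush()
        time.sleep(60)  # one sample per minute

Leave it running in a console window while you browse; when the error hits, the last rows of the CSV will show whether Chrome's memory use had ballooned.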

Note that Microsoft Edge (which is built on the Chromium engine) has a similar applet they call Browser Task Manager. While Edge seems to handle resources better than Chrome, it is not immune to resource hogging issues either.
 

Trusted Advisor · 25,951 Posts
deleted so as to not confuse thread starter
 

Trusted Advisor · 908 Posts
That is not always correct
:( That's pretty much what '"should" not' implies. Exceptions don't make the rule!

In fact, the scenario you just described perfectly illustrates why I said users should just let Windows manage the PFs. It is poor resource management (by the user!) to disable the PF on the boot (and fast) SSD and then rely on a PF on an agonizingly slow, bottlenecked secondary hard drive. :(

Windows is smart enough to leave the main PF on the drive that contains the OS. In that scenario, clearly, the user is not. And in that scenario, it is the user who failed to properly configure their computer! And of course, it is the user's responsibility to ensure the boot drive has the necessary free disk space.

If the user does not know how to properly do all that, they should leave the defaults alone!

This is why I also said,
For sure, you can have multiple page files on multiple disks. And this is common practice when the boot disk is small and there is lots of free space on secondary drives. However, ideally, you want the PF on the fastest drive. Once again, Microsoft and Windows know how to deal with that too and Windows will detect and use the PF on the fastest drive for the higher priority data - assuming there is the necessary free space available to do so.
...and...
I recommend you first change all your PF settings back to Windows managed.
 

Super Moderator · 20,018 Posts
Hi Macboatmaster and Digerati,
Let us not forget there is another forum you could use for the debate that you seem to be having.

Meanwhile, are we not clouding the issue and getting away from the original poster's question?
 

Trusted Advisor · 908 Posts
You are right. I have no desire to debate "what if" scenarios that don't represent the norm. I am just trying to help the OP with possible solutions based on possible causes - hence my recommendations.

With that, I'm moving on and Happy Thanksgiving to all.
 

Registered · 10 Posts · Discussion Starter · #9
I am not a PF management expert, and I let Windows manage it until I replaced my OS disk with an SSD. However, I was advised to disable it on the SSD because "the pagefile can wear out your SSD prematurely".
I must add that I did the SSD upgrade under Windows 7 on another computer. Since then I have upgraded to Windows 10, transplanted the OS disk into a new PC, upgraded to Windows 11, and everything has always worked like a charm.
I can't rule out that this is part of the problem, but I doubt the issue originates from my PF management.
 

Trusted Advisor · 25,951 Posts
I use a third disk (500 GB) for downloads, Conda environments, temporary installations and an 8 GB PF.
Is that disk a spinning hard drive?
If so, having the page file on a spinning hard drive while the OS is on an SSD is not good practice, as the spinning hard drive is always trying to catch up with the SSD.

I suggest you start with the various options on the link I posted in my first reply.
 

Trusted Advisor · 908 Posts
However, I was advised to disable it on the SSD because "the pagefile can wear out your SSD prematurely".
:( Sorry, but that is total nonsense. Whoever told you that is living in the Stone Age.

Limited write cycles on SSDs have not been a problem for years now. And frankly, they never were for "normal" computer users. And normal is 99% of us.

A large percentage of laptops on the market today, and more and more PCs, come with SSDs only - no hard drive. And by default, they have Windows-managed PFs on those SSDs. With TRIM and wear leveling, it takes years of constant writes to wear out an SSD. And that just does not happen with normal users. Let's not forget that the vast majority of PF activity is reads, not writes. And reads have no effect on longevity.

If SSDs wearing out was a problem, these laptop and PC makers would still be using hard drives.

It is more likely the power supply, motherboard, CPU, graphics or another component will die before the SSD. And it is much more likely the user will simply retire the computer for something new and current before the SSD dies.

If you look at a budget $60 500GB SSD with a TBW (terabytes written) rating of 300TBW, that means you would have to fill that 500GB drive up, then delete everything, and fill it up again 600 times before you reached its write limit. How likely is that? 1TB is 1,000GB, so 300TB is 300,000GB. Do you see yourself writing that much data to a drive?
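The arithmetic, as a quick sketch (the 20 GB/day figure is just an assumed example of a heavy daily write load):

Code:
# TBW arithmetic for the example 500 GB / 300 TBW drive above.
tbw_gb = 300 * 1000      # 300 TB written = 300,000 GB
drive_gb = 500
print(f"Complete drive fills before the write limit: {tbw_gb / drive_gb:.0f}")  # 600

daily_gb = 20            # assumed example of a heavy daily write load
print(f"Years at {daily_gb} GB written per day: {tbw_gb / daily_gb / 365:.0f}")  # ~41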

Not to mention, that "budget" SSD has a 5 year warranty. Most hard drives these days have 1 year, maybe 3. You have to go with expensive "enterprise" HDs to get 5 year warranties.

And let's not forget, hard drives have 2 motors and other mechanical (moving parts) components. They are big, heavy, noisy, power hungry, heat generating and slow.
 