So I have a Gentoo server and I'm currently recompiling a bunch of Haskell libraries (you know, as you do), and I noticed that as I started building, the cached RAM went down. I don't think this is necessarily a problem, but it's a bit confusing to me. I am using ccache with these builds, so idk if that plays into it.
Is this just because every time I build a new package, I'm invalidating the cache that came from the previous package? Should I be worried?
Pretty netdata screenshots to show the inverse relationship between cached RAM and CPU usage:
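In case it's relevant to the ccache question: ccache's cache lives on disk, not in that "cached" RAM number, though reading and writing it does go through the kernel page cache. A quick way to peek at it (assuming ccache is on your PATH):

```shell
# ccache keeps its cache on disk (default ~/.ccache or ~/.cache/ccache),
# so it isn't the "cached" line netdata graphs -- but accessing those
# files does populate the kernel page cache like any other file I/O.
if command -v ccache >/dev/null 2>&1; then
    ccache -s          # hit/miss stats and current cache size
else
    echo "ccache not found on PATH"
fi
```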
If your build system holds the previous package's files in memory after it's built, it's normal to see that number go down: when you start a new package, the kernel evicts the cached data it no longer needs from the last build and keeps only the small set of tools and files it thinks it'll need for the next one.
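For what it's worth, the "cached" number netdata shows is mostly the kernel page cache: file data parked in otherwise-idle RAM. You can read it straight from the kernel yourself, assuming a Linux box with /proc mounted:

```shell
# Print the current page-cache size as the kernel reports it.
awk '/^Cached:/ {print $2, "kB of file data in the page cache"}' /proc/meminfo
```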
Think of it like a toolbox. You have all the basic shit you need in the top trays, but when you need other tools you have to go grab them, and when you're done with them you put them away. Kinda like… you wouldn't store a hammer drill with your screwdrivers. It just wastes space. But damn, you're always needing like 2 or 3 different screwdrivers. Or whatever.
What would be a concern, I think, is if your cache completely wiped after each build. Not only would every build take longer to go through (everything gets re-read cold from disk), it could also be a sign of memory pressure squeezing out the files and tools the build keeps coming back to.
If it's a huge concern and you want to make sure, check how much of your memory is actually cache over time. If it bounces around wildly or randomly disappears mid-build, then that could be a thing to check out.
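If you want to actually watch it move instead of eyeballing the netdata graph, here's a rough sketch that samples /proc/meminfo a few times while a build runs (the interval and count are arbitrary, bump them up for a real build):

```shell
# Sample the page cache a few times; run this alongside a build.
# A big drop when a new package starts is normal (old source trees
# get evicted); a cache that stays pinned near zero the whole time
# would be the weird case worth digging into.
for i in 1 2 3; do
    printf '%s  ' "$(date +%T)"
    awk '/^(Buffers|Cached):/ {printf "%s %s kB  ", $1, $2} END {print ""}' /proc/meminfo
    sleep 1
done
```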
TLDR: nah, it's just holding what it needs and clearing out what it doesn't. If the cache cleared itself 100% after each build, then you might have a problem.