Xorg 3D acceleration question (Alder Lake)
Hello,

So the ArchWiki can be very confusing about all of this, and I am sure a lot of people just install every package and hope it does something, but I would like to actually understand it. So, sorry for the ML noise; I would like some clarification from those who are knowledgeable about this topic.

The reason I ask: I got a new laptop with Alder Lake, and I am trying my best to optimise it. I dropped 70% battery in just 2 hours on a 73 Wh battery, which is absurd, so I checked the following two things, which are recommended to improve performance and battery life:

1. 3D acceleration is in use
2. Hardware video acceleration is in use

As for 1, it was enabled by default. However, the ArchWiki Xorg page (https://wiki.archlinux.org/title/Xorg#Driver_installation) lists the following driver to be installed for Xorg:

xf86-video-intel

Running glxinfo before installing said package, direct rendering is enabled (DRI), which would indicate that 3D acceleration is working as needed. So I am a little confused about why it works without the Xorg driver installed. I have the xorg metapackage installed (I am lazy, and the Xorg utilities are useful). Is xf86-video-intel used to enable DRI3? Is DRI2 possible without said driver? The wiki is really not clear about this, and I know a lot of people probably wonder why it matters if 3D acceleration is already enabled, but I would like to know what is going on :)

Also, saying "just use Wayland" is not a valid answer; this has happened before, so I am just putting it out there. I would appreciate a detailed explanation of what is going on to clear up the confusion.

Now back to point 2: this was not enabled. VA-API is the hardware video acceleration API used on Intel (it was originally developed by Intel), and after installing the required driver (intel-media-driver), vainfo (from libva-utils) successfully shows support for hardware video acceleration.

If anyone else has any suggestions or experience with optimising Alder Lake mobile chips on Arch Linux, I would also appreciate the pointers.

As a small note, my old laptop was AMD (Ryzen 5 5500U); I cannot remember whether I could be bothered to set up hardware video acceleration or optimise the integrated graphics. I ran that Arch install for 2 years, and typical me didn't document any of my configuration or dotfiles, so I am trying to piece distant memories back together; luckily the ArchWiki is a good refresher :P

Thank you for your time,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@polarian.dev
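P.S. For anyone skimming later, the whole check boiled down to a couple of commands (glxinfo ships in mesa-utils, if you don't already have it):

    # VA-API driver for recent Intel iGPUs, plus the verification tools
    sudo pacman -S intel-media-driver libva-utils mesa-utils
    # 3D acceleration: should print "direct rendering: Yes"
    glxinfo | grep "direct rendering"
    # hardware video acceleration: should list VAProfile/VAEntrypoint pairs
    vainfo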
On Thu, 10 Aug 2023 03:17:15 +0100 Polarian <polarian@polarian.dev> wrote:
Hello,
So the ArchWiki can be very confusing about all of this, and I am sure a lot of people just install every package and hope it does something, but I would like to actually understand it. So, sorry for the ML noise; I would like some clarification from those who are knowledgeable about this topic.
The reason I ask: I got a new laptop with Alder Lake, and I am trying my best to optimise it. I dropped 70% battery in just 2 hours on a 73 Wh battery, which is absurd, so I checked the following two things, which are recommended to improve performance and battery life:
1. 3D acceleration is in use
2. Hardware video acceleration is in use
Hi, before checking graphics, did you check powertop? It should be able to tell which part is consuming power.
Hello,

No, I did not actually check powertop; I probably should have. I have never used it before, as far as I can remember. I have taken a look at the manpage and the ArchWiki page, and the following seem to have the highest event counts (so I assume the highest power use?); the top 3, in order:

- tick_sched_timer
- HI_SOFTIRQ
- i915

Third place is the driver for the integrated Intel graphics, so I assume it was valid to conclude that the graphics is one of the major power draws.

Is there something specific you are looking for from powertop? It dumps a lot of data, and someone who is new to the utility isn't going to get much use out of it at 3:39 in the morning. I am not asking to be spoon-fed, just stating that I don't have time to read about how to interpret the data at 3am, sorry. If there is something specific you would like, I can provide it.

Thank you for the response,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@polarian.dev
On Thu, 10 Aug 2023 03:40:09 +0100 Polarian <polarian@polarian.dev> wrote:
Hello,
No, I did not actually check powertop; I probably should have.
I have never used it before, as far as I can remember. I have taken a look at the manpage and the ArchWiki page, and the following seem to have the highest event counts (so I assume the highest power use?); the top 3, in order:
- tick_sched_timer
- HI_SOFTIRQ
- i915
Third place is the driver for the integrated Intel graphics, so I assume it was valid to conclude that the graphics is one of the major power draws.
Is there something specific you are looking for from powertop,
IIUC you are looking at the "Overview" pane, and your top 3 basically says your system is idle. My system does not show HI_SOFTIRQ here, no idea which part it corresponds to... Anyway integrated GPU usually does not draw a lot of power. Maybe check "Idle stats" and "Frequency stats" to be sure your CPU is actually put to sleep; on my i5-12400 the cores are >99% in C10, frequency <800MHz. Also check "Device stats" just in case; I have seen SD card readers constantly consuming power without cards inserted.
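If poking through the TUI at 3am is the problem, powertop can also dump a one-shot report to a file you can read later:

    # generate an HTML report covering one measurement interval
    sudo powertop --html=powertop.html
    # or CSV, if you would rather grep it
    sudo powertop --csv=powertop.csv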
Hello,

To reduce noise I will reply to both Carl and eNV25 in a single email.

Carl:
IIUC you are looking at the "Overview" pane, and your top 3 basically says your system is idle. My system does not show HI_SOFTIRQ here, no idea which part it corresponds to... Anyway integrated GPU usually does not draw a lot of power. Maybe check "Idle stats" and "Frequency stats" to be sure your CPU is actually put to sleep; on my i5-12400 the cores are >99% in C10, frequency <800MHz. Also check "Device stats" just in case; I have seen SD card readers constantly consuming power without cards inserted.
Yes, this was taken at idle, but after further testing this does not seem to be the problem.

I believe the hardware video acceleration has actually made a noticeable change, but I won't know until I have let the battery drain a little and seen how long it lasts.

I wonder if there is a chance that the battery has a bad BMS; I do not see how 73 Wh could be drained in 2-2.5 hours, because that would be an average of 29.2 W, which you would definitely feel, but the laptop did not feel very hot either...

Maybe it is just because it's a new battery? I know some devices (mainly Android phones) recommend charge-cycling the battery a few times for the battery life to be properly reported by the operating system, but I do not know if the same applies to Linux.

eNV25:
Actually, xf86-video-intel is not really recommended for most people: https://wiki.archlinux.org/title/Intel_graphics#Installation
Ah, I completely missed this; I was looking at the table. Thank you for pointing it out.

As a sidenote, when I did have xf86-video-intel installed and I played a YouTube video, the integrated graphics clocked at max, CPU temps jumped to 80C in seconds, and there was a ~23 W drain on the battery, which isn't good :/

Simply removing the driver and rebooting fixed it: playing the same video, the issue doesn't exist. So yes, installing it was definitely a mistake.
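For anyone else who installed it by mistake, removal is a one-liner:

    # drop the old DDX driver; after a reboot Xorg falls back to the
    # built-in modesetting driver
    sudo pacman -Rns xf86-video-intel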
Make sure Early KMS is enabled. https://wiki.archlinux.org/title/Kernel_mode_setting#Early_KMS_start
Sorry for asking, but why is this useful? What benefit is there to loading the graphics driver early, given that the kernel automatically loads it later in the boot anyway (along with any other drivers for hardware it detects)? Just curious to see how this would be useful.

As a note, after some investigation with powertop, it does seem the integrated graphics draw a considerable amount of power decoding 1080p video: it is 8 W extra to watch a video on YouTube. Even stranger, it jumped 15 W when watching through mpv.

What the hell is wrong with integrated graphics? 1080p video should not stress modern integrated graphics, so why is it drawing so much power? No wonder the laptop battery died in 2 hours: watching any video, both through Firefox and through mpv, is drawing almost 30 W!!! That is as much as a desktop CPU in some cases, and this is a mobile chip. Am I missing something? This laptop is quoted as getting 7+ hours of battery life.

I have heard that Alder Lake, with its big.LITTLE clone, has issues on Linux where performance cores are used when efficiency cores are meant to be, etc. C7 seems to be the most used C-state.

The issue is so weird, though: I have retried YouTube through Firefox and it only drew 4 W more than idle. Why is this so inconsistent?

Any ideas would be helpful :)

Thanks for the help,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@polarian.dev
On Thu, Aug 10, 2023 at 12:12 PM Polarian <polarian@polarian.dev> wrote:
Hello,
To reduce noise I will reply to both Carl and eNV25 in a single email.
Carl:
IIUC you are looking at the "Overview" pane, and your top 3 basically says your system is idle. My system does not show HI_SOFTIRQ here, no idea which part it corresponds to... Anyway integrated GPU usually does not draw a lot of power. Maybe check "Idle stats" and "Frequency stats" to be sure your CPU is actually put to sleep; on my i5-12400 the cores are >99% in C10, frequency <800MHz. Also check "Device stats" just in case; I have seen SD card readers constantly consuming power without cards inserted.
Yes this was taken at idle, but after further testing this does not seem to be the problem.
I believe the Hardware video acceleration has actually made a noticeable change, but I won't know until I have let the battery drain a little and see how long it will last.
I wonder if there is a chance that the battery has a bad BMS; I do not see how 73 Wh could be drained in 2-2.5 hours, because that would be an average of 29.2 W, which you would definitely feel, but the laptop did not feel very hot either...
Maybe it is just because it's a new battery? I know some devices (mainly Android phones) recommend charge-cycling the battery a few times for the battery life to be properly reported by the operating system, but I do not know if the same applies to Linux.
eNV25:
Actually, xf86-video-intel is not really recommended for most people: https://wiki.archlinux.org/title/Intel_graphics#Installation
Ah I completely missed this, I was looking at the table. Thank you for pointing it out.
As a sidenote, when I did have xf86-video-intel installed and I played a YouTube video, the integrated graphics clocked at max, CPU temps jumped to 80C in seconds, and there was a ~23 W drain on the battery, which isn't good :/
Simply removing the driver and rebooting fixed it: playing the same video, the issue doesn't exist. So yes, installing it was definitely a mistake.
Make sure Early KMS is enabled. https://wiki.archlinux.org/title/Kernel_mode_setting#Early_KMS_start
Sorry for asking, but why is this useful?
What benefit is there to loading the graphics driver early, given that the kernel automatically loads it later in the boot anyway (along with any other drivers for hardware it detects)?
Just curious to see how this would be useful.
As a note, after some investigation with powertop, it does seem the integrated graphics draw a considerable amount of power decoding 1080p video: it is 8 W extra to watch a video on YouTube. Even stranger, it jumped 15 W when watching through mpv.
What the hell is wrong with integrated graphics? 1080p video should not stress modern integrated graphics, so why is it drawing so much power?
This is probably because you are not using hardware video acceleration.

https://wiki.archlinux.org/title/Hardware_video_acceleration
https://wiki.archlinux.org/title/Hardware_video_acceleration#Configuring_app...

If hardware video acceleration is enabled and working in your browser, but does not work on YouTube, it is possible your CPU does not support the AV1 (or VP9 or VP8) video codec. Try disabling the AV1 (or VP9 or VP8) codec with this extension: https://github.com/alextrv/enhanced-h264ify
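The same codec trick works outside the browser too; with mpv's yt-dlp integration, something like the following (the URL is a placeholder) forces an h264 stream, which any recent Intel iGPU can decode in hardware:

    # ask yt-dlp for an avc1 (h264) video stream instead of AV1/VP9
    mpv --ytdl-format='bestvideo[vcodec^=avc1]+bestaudio/best' 'https://www.youtube.com/watch?v=...'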
No wonder the laptop battery died in 2 hours: watching any video, both through Firefox and through mpv, is drawing almost 30 W!!! That is as much as a desktop CPU in some cases, and this is a mobile chip. Am I missing something?
This laptop is quoted to be able to get 7+ hours of battery life.
I have heard that Alder Lake, with its big.LITTLE clone, has issues on Linux where performance cores are used when efficiency cores are meant to be, etc.
C7 seems to be the most used C state.
The issue is so weird, though: I have retried YouTube through Firefox and it only drew 4 W more than idle. Why is this so inconsistent?
Any ideas would be helpful :)
Thanks for the help,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@polarian.dev
Hello,
This is probably because you are not using hardware video acceleration.
As I said in a previous email in this thread, I did enable hardware video acceleration, and I explained how I believe this improved the battery life considerably. But as I have no scientific method of testing it (I need the laptop, and I don't want to leave it unattended playing video to gather a dataset), I can't fully tell :)
If hardware video acceleration is enabled and working in your browser, but does not work on YouTube, it is possible your CPU does not support the AV1 (or VP9 or VP8) video codec.
I used YouTube as a backup; I'd rather use mpv to play the videos and the audio. Browsers are notoriously inefficient when doing compression and decompression of audio, so I did not expect to get good battery life within the browser.

Also:

❯ vainfo
Trying display: wayland
Trying display: x11
vainfo: VA-API version: 1.19 (libva 2.19.0)
vainfo: Driver version: Intel iHD driver for Intel(R) Gen Graphics - 23.3.0 ()
vainfo: Supported profile and entrypoints
      VAProfileNone                   : VAEntrypointVideoProc
      VAProfileNone                   : VAEntrypointStats
      VAProfileMPEG2Simple            : VAEntrypointVLD
      VAProfileMPEG2Simple            : VAEntrypointEncSlice
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileMPEG2Main              : VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointFEI
      VAProfileH264Main               : VAEntrypointEncSliceLP
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
      VAProfileH264High               : VAEntrypointFEI
      VAProfileH264High               : VAEntrypointEncSliceLP
      VAProfileVC1Simple              : VAEntrypointVLD
      VAProfileVC1Main                : VAEntrypointVLD
      VAProfileVC1Advanced            : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointEncPicture
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
      VAProfileH264ConstrainedBaseline: VAEntrypointFEI
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
      VAProfileVP8Version0_3          : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointEncSlice
      VAProfileHEVCMain               : VAEntrypointFEI
      VAProfileHEVCMain               : VAEntrypointEncSliceLP
      VAProfileHEVCMain10             : VAEntrypointVLD
      VAProfileHEVCMain10             : VAEntrypointEncSlice
      VAProfileHEVCMain10             : VAEntrypointEncSliceLP
      VAProfileVP9Profile0            : VAEntrypointVLD
      VAProfileVP9Profile0            : VAEntrypointEncSliceLP
      VAProfileVP9Profile1            : VAEntrypointVLD
      VAProfileVP9Profile1            : VAEntrypointEncSliceLP
      VAProfileVP9Profile2            : VAEntrypointVLD
      VAProfileVP9Profile2            : VAEntrypointEncSliceLP
      VAProfileVP9Profile3            : VAEntrypointVLD
      VAProfileVP9Profile3            : VAEntrypointEncSliceLP
      VAProfileHEVCMain12             : VAEntrypointVLD
      VAProfileHEVCMain12             : VAEntrypointEncSlice
      VAProfileHEVCMain422_10         : VAEntrypointVLD
      VAProfileHEVCMain422_10         : VAEntrypointEncSlice
      VAProfileHEVCMain422_12         : VAEntrypointVLD
      VAProfileHEVCMain422_12         : VAEntrypointEncSlice
      VAProfileHEVCMain444            : VAEntrypointVLD
      VAProfileHEVCMain444            : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_10         : VAEntrypointVLD
      VAProfileHEVCMain444_10         : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_12         : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain10          : VAEntrypointVLD
      VAProfileHEVCSccMain10          : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain444         : VAEntrypointVLD
      VAProfileHEVCSccMain444         : VAEntrypointEncSliceLP
      VAProfileAV1Profile0            : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointEncSliceLP

All the codecs you have asked about have a valid entrypoint, thus they are supported for hardware video acceleration.

I have been using the laptop for about 3 hours now without charging it, and I am at 67%: a 33% drop. Multiplying that out, in theory it should last about 9 hours at the current usage rate, though I should point out that I have only been replying to emails and talking on XMPP, which are desktop applications and relatively efficient.
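For completeness, the only mpv-side change needed for VA-API decoding is one line of config (standard mpv options):

    # ~/.config/mpv/mpv.conf -- prefer hardware decoding where it is safe,
    # falling back to software otherwise
    hwdec=auto-safe

    # one-off test from a terminal; mpv should log
    # "Using hardware decoding (vaapi)" when it works:
    #   mpv --hwdec=vaapi somevideo.mkv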
I have noticed that although Intel has tried to make efficient CPUs for laptops, the added cores do cause higher idle energy draw; ideally you want the performance cores downclocked by a lot. I assume you could optimise for the best battery life by disabling the performance cores and keeping only the efficiency cores, but at the end of the day this is x86, a complex instruction set; you don't expect good battery life.

In all honesty it is not even a big deal. I do not particularly care about the battery life as such; it's that I would like to use as little power as possible. Less power means less environmental impact, and the energy bill is cheaper; the battery life is an added bonus!

I do find it funny how people complain about battery life, because you almost always have a charger on you, and there are plug sockets everywhere: in cafes, on the train, on the plane, even in some bathrooms. I am not going to pass up additional energy efficiency if it is easy to achieve, that's all :)

Thanks for the help!

Take care,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@icebound.dev
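If anyone wants to experiment with the efficiency-cores-only idea, cores can be toggled through sysfs; cpu4 below is only an example, so check lscpu for your own P-core/E-core layout first:

    # show which CPUs are which (core types differ in max MHz)
    lscpu --all --extended
    # take a performance core offline (numbering is machine-specific!)
    echo 0 | sudo tee /sys/devices/system/cpu/cpu4/online
    # and bring it back
    echo 1 | sudo tee /sys/devices/system/cpu/cpu4/online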
On Thu, Aug 10, 2023 at 3:12 AM Polarian <polarian@polarian.dev> wrote:
Hello,
So the ArchWiki can be very confusing about all of this, and I am sure a lot of people just install every package and hope it does something, but I would like to actually understand it. So, sorry for the ML noise; I would like some clarification from those who are knowledgeable about this topic.
The reason I ask: I got a new laptop with Alder Lake, and I am trying my best to optimise it. I dropped 70% battery in just 2 hours on a 73 Wh battery, which is absurd, so I checked the following two things, which are recommended to improve performance and battery life:
1. 3D acceleration is in use
2. Hardware video acceleration is in use
As for 1, it was enabled by default. However, the ArchWiki Xorg page (https://wiki.archlinux.org/title/Xorg#Driver_installation) lists the following driver to be installed for Xorg:
xf86-video-intel
Actually, xf86-video-intel is not really recommended for most people:
https://wiki.archlinux.org/title/Intel_graphics#Installation

Make sure early KMS is enabled:
https://wiki.archlinux.org/title/Kernel_mode_setting#Early_KMS_start
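On Arch, early KMS for Intel comes down to putting i915 in the initramfs and rebuilding it:

    # /etc/mkinitcpio.conf -- load the Intel DRM driver before userspace starts
    MODULES=(i915)

    # then regenerate the initramfs for all installed kernels
    sudo mkinitcpio -P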
Running glxinfo before installing said package, direct rendering is enabled (DRI), which would indicate that 3D acceleration is working as needed.
So I am a little confused about why it works without the Xorg driver installed. I have the xorg metapackage installed (I am lazy, and the Xorg utilities are useful).
Is xf86-video-intel used to enable DRI3? Is DRI2 possible without said driver?
The wiki is really not clear about this, and I know a lot of people probably wonder why it matters if 3D acceleration is already enabled, but I would like to know what is going on :)
Also, saying "just use Wayland" is not a valid answer; this has happened before, so I am just putting it out there.
I would appreciate a detailed explanation on what is going on to clear the confusion.
Now back to point 2: this was not enabled. VA-API is the hardware video acceleration API used on Intel (it was originally developed by Intel), and after installing the required driver (intel-media-driver), vainfo (from libva-utils) successfully shows support for hardware video acceleration.
If anyone else has any suggestions or experience with optimising alder lake mobile chips on Arch Linux, I would also appreciate the pointers.
As a small note, my old laptop was AMD (Ryzen 5 5500U); I cannot remember whether I could be bothered to set up hardware video acceleration or optimise the integrated graphics. I ran that Arch install for 2 years, and typical me didn't document any of my configuration or dotfiles, so I am trying to piece distant memories back together; luckily the ArchWiki is a good refresher :P
Thank you for your time,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@polarian.dev
The main reasons I switched to Arch a couple of months ago were a) learning Linux systems and b) optimizing my laptop's battery. Here's what I've done to get my battery life from 1.5 hrs on Windows to 9+ hrs on Arch (65 Wh capacity) with my Razer Blade 2021 (RTX 3070):

* Disable the GPU. This is the single most important step, assuming you're not gaming while on battery. I've tried a lot of different methods; the only one that worked for my machine was EnvyControl: https://aur.archlinux.org/packages/envycontrol This allows me to toggle my GPU off entirely, and if I want to game I'll turn it back into hybrid/nvidia mode, then in the BIOS I'll enable the dGPU-only mode that my BIOS supports. It's a little clunky getting the GPU back on, but it's the best solution I've found, and it still gives me more control than Windows.

* Set up auto-cpufreq. This will automatically configure the CPU into battery-saving modes while idle and performance modes when plugged in. It works well out of the box, but you can configure it further if you wish, such as disabling turbo mode and lowering the max clock.

* powertop --auto-tune. Powertop is great for monitoring your power consumption (I've got mine down to ~7 W under light load; it was 15-25 W on Windows). Do note that in order to use anything other than the total system power draw, you need to run powertop's calibration. Also note that it may cause some issues if you enable power saving on everything; e.g. after running auto-tune I have to go disable the power saving on my laptop's keyboard so that I don't have to wake the keyboard up every time I want to start typing.

These three things are really the meat of it. Best of luck.
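For reference, the actual switching is one command per mode (flags as documented in the EnvyControl README; a reboot is needed after each switch):

    # dGPU fully off
    sudo envycontrol -s integrated
    # back to hybrid when I want to game
    sudo envycontrol -s hybrid
    # show the current mode
    envycontrol --query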
-- Nicolas Strike
Hello,
* Disable the GPU. This is the single most important step assuming you're not gaming while on battery. I've tried a lot of different methods, the only one that worked for my machine was Envy Control: https://aur.archlinux.org/packages/envycontrol
Or, you know, you could just blacklist the nvidia module from being loaded; hardware is useless without the kernel module for it. That is a good way to disable the microphone and camera, too.

Also, I do have to question why you would buy a laptop with a dedicated graphics card just to disable it and never use it. You could have gotten something with the same build quality without the dedicated graphics card, and maybe a bigger battery; it would have fitted your workflow better, but meh.

Also, this step will be useless for me, since I specifically picked a laptop without nvidia sh*t, hence why I have only spoken about the Intel integrated graphics :)
This allows me to toggle my gpu off entirely, and if I want to game I'll turn it back into hybrid/nvidia mode and then in the bios I'll enable the dGPU only mode that my bios supports. It's a little clunky getting the GPU back on, but it's the best solution I've found and still gives me more control than Windows.
As a note, hybrid mode support on Linux is horrific; I have seen so many cases where it goes wrong. Hell, it doesn't even work very well on Windows: there are tons of times Windows will use the integrated graphics for games even though it is meant to use the dedicated card. The rule of thumb is to pick one or the other, and it should be all good :P

You can always make a script which modprobes the nvidia module, unloading it and loading it again; there is no chance of it being active when the kernel doesn't even know of it :P That is what I would have done, but envycontrol looks like a much more user-friendly way of doing this :)
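Something like this is what I had in mind (module names are the ones the proprietary driver loads; nouveau added for good measure). Note that blacklisting only stops automatic loading; an explicit modprobe will still succeed:

    # /etc/modprobe.d/blacklist-nvidia.conf
    blacklist nvidia
    blacklist nvidia_drm
    blacklist nvidia_modeset
    blacklist nouveau

    # or on demand: unload (dependent modules first), reload when needed
    sudo modprobe -r nvidia_drm nvidia_modeset nvidia_uvm nvidia
    sudo modprobe nvidia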
* Set up auto-cpufreq. This will automatically configure the CPU into battery-saving modes while idle and performance modes when plugged in. It works well out of the box, but you can configure it further if you wish, such as disabling turbo mode and lowering the max clock.
As far as I am aware, auto-cpufreq does nothing more than change the governor and CPU clock speeds based on whether the charger is plugged in or not. I assume this is just a more user-friendly way than sticking udev rules in to detect the connection of a charger.

At the end of the day you could just set the governor to conservative, which tends to be a good middle ground: it clocks up a lot more slowly and therefore draws a lot less power, though when you hit your laptop with an intensive load it can take a while for the CPU to speed up.

It is a good option, but I do not think my current problem is due to CPU clock speeds; they are already low.
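Setting that by hand is simple enough. One caveat: the default intel_pstate driver only exposes the performance and powersave governors, so on Alder Lake "conservative" may not be available unless you switch to acpi-cpufreq:

    # cpupower is packaged separately on Arch
    sudo pacman -S cpupower
    # set a governor on all cores (if the cpufreq driver offers it)
    sudo cpupower frequency-set -g conservative
    # see the current driver, governor, and frequency limits
    cpupower frequency-info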
* powertop --auto-tune. Powertop is great for monitoring your power consumption (I've got mine down to ~7 W under light load; it was 15-25 W on Windows). Do note that in order to use anything other than the total system power draw, you need to run powertop's calibration. Also note that it may cause some issues if you enable power saving on everything; e.g. after running auto-tune I have to go disable the power saving on my laptop's keyboard so that I don't have to wake the keyboard up every time I want to start typing.
This is very useful in general; I believe you can toggle the individual settings within the TUI. I already ran this last night and it has made marginal improvements, which is nice. I am going to let the current settings go through a few charge cycles and see if I am still getting a good 30 Wh drop in less than an hour.

Thanks for the advice,
--
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@polarian.dev
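To keep those tunings across reboots, the usual trick (the ArchWiki powertop page has a unit like this) is a tiny oneshot service:

    # /etc/systemd/system/powertop.service
    [Unit]
    Description=Apply powertop tunings

    [Service]
    Type=oneshot
    ExecStart=/usr/bin/powertop --auto-tune

    [Install]
    WantedBy=multi-user.target

    # enable it: sudo systemctl enable powertop.service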
participants (4): Carl Lei, eNV25, Nic Strike, Polarian