
Thread: Disable Nvidia Driver's Adaptive Clocking?

  1. #1
    Join Date
    Aug 2011
    Location
    Manila, Philippines
    Beans
    9
    Distro
    Ubuntu 10.10 Maverick Meerkat

    Disable Nvidia Driver's Adaptive Clocking?

    I use an NVIDIA GeForce 9500 GT 1024 MB 128-bit graphics card with driver version 295.20 under Ubuntu 10.10 Maverick Meerkat, and I have noticed that I cannot disable the adaptive clocking functionality.

    How do I disable the adaptive clocking so that my GPU will not decrease its clock?
    How do I permanently set the Preferred Mode in the PowerMizer settings to Prefer Maximum Performance?

    I have skimmed through the forum threads relating to this, and none of the solutions seems to work for me.

    Please share your solutions if you have come across this anomaly before.

    Here is what I have tried so far:

    I have added the following lines under Section "Device" in xorg.conf:
    Code:
        Option         "Coolbits" "1"
        Option         "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1"
    ^The Coolbits option exposed the overclocking controls, but I do not want to play with those because I had a bad experience with overclocking in the past. I just want to do this safely by simply disabling the adaptive clocking. I tried checking Enable Overclocking without changing the clocks, to see whether it would disable the adaptive clocking, but it did not. (See screenshot in the attached files.)
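
    As a sanity check (my own verification idea, not something from the driver documentation), the X log should show whether these options were actually parsed; the exact wording may differ between driver versions:
    Code:
    # Look for the options the nvidia driver picked up at X startup
    grep -i "coolbits\|registrydwords\|powermizer" /var/log/Xorg.0.log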

    I have created the /etc/modprobe.d/nvidia.conf file and added this line:
    Code:
    options nvidia NVreg_RegistryDwords="PerfLevelSrc=0x2222"
    ^This did nothing (I think), but the file is still in my home directory.
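
    If the file ended up in the home directory rather than in /etc/modprobe.d/, the driver would never see it; and if I understand correctly, the option also only takes effect after the module is reloaded. Something like the following should apply it (my assumption about the procedure, and the home-directory path is a guess):
    Code:
    # Move the file where modprobe actually reads it (the $HOME path is a guess)
    sudo cp ~/nvidia.conf /etc/modprobe.d/nvidia.conf
    # Rebuild the initramfs so the option is also applied at boot, then reboot
    sudo update-initramfs -u
    sudo reboot
    # After rebooting, check which parameters the nvidia module was loaded with
    grep -i registrydwords /proc/driver/nvidia/params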

    I also tried editing .nvidia-settings-rc and changed:
    Code:
    Timer = PowerMizer_Monitor_(GPU_0),Yes,1000
    to:
    Code:
    Timer = PowerMizer_Monitor_(GPU_0),No,1000
    ^This change just managed to uncheck the PowerMizer monitor in the settings configuration, but the adaptive clocking is still there. (See screenshot in the attached files.)
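
    Another thing that might be worth trying (assuming the 295 driver exposes the GPUPowerMizerMode attribute; I have not confirmed that for this card) is setting the Preferred Mode from the command line:
    Code:
    # 1 = Prefer Maximum Performance, 0 = Adaptive
    nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=1"
    # Query it back to confirm the mode actually changed
    nvidia-settings -q "[gpu:0]/GPUPowerMizerMode"
    If that works, adding the first command to Startup Applications should reapply it at every login.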


    Here are the full contents of my xorg.conf:
    Code:
    # nvidia-xconfig: X configuration file generated by nvidia-xconfig
    # nvidia-xconfig:  version 295.20  (buildmeister@swio-display-x86-rhel47-05.nvidia.com)  Mon Feb  6 22:13:40 PST 2012
    
    # nvidia-settings: X configuration file generated by nvidia-settings
    # nvidia-settings:  version 295.20  (buildmeister@swio-display-x86-rhel47-05.nvidia.com)  Mon Feb  6 22:13:16 PST 2012
    
    Section "ServerLayout"
        Identifier     "Layout0"
        Screen      0  "Screen0" 0 0
        InputDevice    "Keyboard0" "CoreKeyboard"
        InputDevice    "Mouse0" "CorePointer"
        Option         "Xinerama" "0"
    EndSection
    
    Section "Files"
    EndSection
    
    Section "InputDevice"
    
        # generated from default
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/psaux"
        Option         "Emulate3Buttons" "no"
        Option         "ZAxisMapping" "4 5"
    EndSection
    
    Section "InputDevice"
    
        # generated from default
        Identifier     "Keyboard0"
        Driver         "kbd"
    EndSection
    
    Section "Monitor"
        Identifier     "Monitor0"
        VendorName     "Unknown"
        ModelName      "Chi Mei Optoelectronics corp. A16B1"
        HorizSync       30.0 - 80.0
        VertRefresh     50.0 - 75.0
        Option         "DPMS"
    EndSection
    
    Section "Device"
        Identifier     "Device0"
        Driver         "nvidia"
        VendorName     "NVIDIA Corporation"
        BoardName      "GeForce 9500 GT"
        Option         "Coolbits" "1"
        Option         "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1"
    EndSection
    
    Section "Screen"
        Identifier     "Screen0"
        Device         "Device0"
        Monitor        "Monitor0"
        DefaultDepth    24
        Option         "TwinView" "0"
        Option         "TwinViewXineramaInfoOrder" "CRT-1"
        Option         "metamodes" "1366x768_60 +0+0"
        SubSection     "Display"
        Depth       24
        EndSubSection
    EndSection
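
    For reference, xorg.conf changes only take effect once X restarts; on Ubuntu 10.10 the display manager is normally GDM (an assumption about this setup), so after each edit I would run something like:
    Code:
    # Keep a backup before experimenting, then restart X so the new options are read
    sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.backup
    sudo service gdm restart
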
    Here are the full contents of my .nvidia-settings-rc:
    Code:
    #
    # /home/corvinus/.nvidia-settings-rc
    #
    # Configuration file for nvidia-settings - the NVIDIA X Server Settings utility
    # Generated on Wed Feb 29 13:07:29 2012
    #
    
    # ConfigProperties:
    
    RcFileLocale = C
    ToolTips = Yes
    DisplayStatusBar = Yes
    SliderTextEntries = Yes
    IncludeDisplayNameInConfigFile = No
    ShowQuitDialog = Yes
    Timer = Thermal_Monitor_(GPU_0),Yes,1000
    Timer = PowerMizer_Monitor_(GPU_0),No,1000
    
    # Attributes:
    
    0/CursorShadow=0
    0/CursorShadowAlpha=64
    0/CursorShadowRed=0
    0/CursorShadowGreen=0
    0/CursorShadowBlue=0
    0/CursorShadowXOffset=4
    0/CursorShadowYOffset=2
    0/SyncToVBlank=0
    0/LogAniso=0
    0/FSAA=0
    0/TextureSharpen=0
    0/TextureClamping=0
    0/AllowFlipping=0
    0/FSAAAppControlled=1
    0/LogAnisoAppControlled=1
    0/OpenGLImageSettings=0
    0/FSAAAppEnhanced=0
    0/RedBrightness=0.000000
    0/GreenBrightness=0.000000
    0/BlueBrightness=0.000000
    0/RedContrast=0.000000
    0/GreenContrast=0.000000
    0/BlueContrast=0.000000
    0/RedGamma=1.000000
    0/GreenGamma=1.000000
    0/BlueGamma=1.000000
    0/DigitalVibrance[CRT-1]=0
    0/OverscanCompensation[CRT-1]=0
    0/XVideoTextureBrightness=0
    0/XVideoTextureContrast=0
    0/XVideoTextureHue=0
    0/XVideoTextureSaturation=0
    0/XVideoTextureSyncToVBlank=1
    0/XVideoSyncToDisplay=2
    I have also attached screenshots of my NVIDIA X Server Settings windows.

  2. #2
    Join Date
    Jun 2007
    Beans
    17,337

    Re: Disable Nvidia Driver's Adaptive Clocking?

    I just use a simple xorg.conf, nothing else, though you should take a read through this:
    http://tutanhamon.com.ua/technovodst...A-UNIX-driver/

    This is what I typically use on this laptop: it gives adaptive clocking on battery and maximum performance on AC, which I am almost always on.

    Code:
    Section "Device"
      Identifier "NVIDIA GeForce"
      Driver     "nvidia"
      Option     "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x3322; PowerMizerDefaultAC=0x1"  
    EndSection
    If I wanted maximum performance on both battery and AC, then this would do:
    Code:
    Section "Device"
      Identifier "NVIDIA GeForce"
      Driver     "nvidia"
      Option     "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x1"  
    EndSection
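
    Based on those two snippets (my reading of them, not official NVIDIA documentation: 0x3322 gives adaptive on battery and maximum on AC, while 0x2222 forces maximum on both), switching between the two is just a one-value edit plus an X restart, for example:
    Code:
    # Back up xorg.conf, then switch from adaptive-on-battery to maximum on both
    sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.bak
    sudo sed -i 's/PerfLevelSrc=0x3322/PerfLevelSrc=0x2222/' /etc/X11/xorg.conf
    # Restart X (log out and back in, or restart the display manager) to apply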

  3. #3
    Join Date
    Aug 2011
    Beans
    2

    Re: Disable Nvidia Driver's Adaptive Clocking?

    This latest reply appears to work for me! NVIDIA PowerMizer stays at level 2! But the CPU still climbs toward maximum after playing Flash movies for more than 5 minutes.
    Last edited by mpeetoom; July 2nd, 2012 at 03:40 PM.

  4. #4
    Join Date
    Aug 2011
    Beans
    2

    Re: Disable Nvidia Driver's Adaptive Clocking?

    Quote Originally Posted by mpeetoom
    This latest reply appears to work for me! NVIDIA PowerMizer stays at level 2! But the CPU still climbs toward maximum after playing Flash movies for more than 5 minutes.
    Do I really have to go back to Windows just to play Flash movies? Please don't let this be true!

    Even after closing the Flash movie, the CPU stays up there.
