PC Ekspert Forum

PC Ekspert Forum (https://forum.pcekspert.com/index.php)
-   NVIDIA (https://forum.pcekspert.com/forumdisplay.php?f=7)
-   -   Nvidia GeForce 6600GT (https://forum.pcekspert.com/showthread.php?t=177758)

Emissary 01.07.2006. 14:11

Quote:

Author: Panzer
I had a VF700 Al-Cu on my 6800LE, without heatsinks on the memory, and it never went above 58° under load... around 50-52°C idle...
The difference between Cu and Al-Cu isn't big, maybe a few °C... ;)
I might get myself a VF900 now, but Faceless told me it bends the card quite a bit... When I had the VF700 mounted, the card wasn't bent at all...

vf900cu - 185g
vf700alcu - 180g
vf700cu - 270g

@crash - excellent; mine is now minimum 47 at idle and 80 under load, card at default (Club3D) :cool:

btw does anyone know what Core VID means in RivaTuner's Hardware Monitoring?

Razer 01.07.2006. 14:14

Quote:

Author: Emissary
vf900cu - 185g
vf700alcu - 180g
vf700cu - 270g

@crash - excellent; mine is now minimum 47 at idle and 80 under load, card at default (Club3D) :cool:

btw does anyone know what Core VID means in RivaTuner's Hardware Monitoring?

Come on, post a screenshot of the temp monitoring from Riva... I think that's the core voltage... I THINK

Emissary 01.07.2006. 14:26

I doubt it :)

http://img434.imageshack.us/img434/1...rd014zk.th.jpg

Razer 01.07.2006. 14:30

Quote:

Author: Emissary

LOL... I think I doubt it too... it would be fun if that were the core voltage :D

Emissary 01.07.2006. 14:36

Hard to say which is funnier, when it shows 0 or when it shows 4 :D

But look what the documentation says:

Q: Why does RivaTuner monitor an abnormally low 1.0V core voltage on my GeForce 6800? Can you fix this bug?

A: You should look carefully at the dimension of the "Core VID" graph's X-axis. I bet that if you do, you will not find "Volts" there. Quite the opposite: you'll see that the value displayed on this graph is non-dimensional. The graph you are looking at shows raw, untransformed VID data. Refer to the questions discussed above to understand what VID is and to find instructions on enabling RivaTuner's voltage interpretation function, which allows you to see target voltages on the graph instead of raw VID data.



Now then, over to the electrical engineers :D

Emissary 01.07.2006. 14:45

And here's the simple explanation:

Q: There are a lot of rumors about GeForce FX software voltmods on the net. I've seen some online BIOS voltmodding tutorials and even voltmodded BIOSes available for download. Can you comment on this?

A: Yes, GPU core voltage really is software controllable on GeForce FX graphics processors. Unfortunately, all the online BIOS voltmodding guides I've seen seem to have been written by blindly comparing different BIOS binaries, without an actual understanding of software voltage control internals, so they contain some logical errors. The same applies to some of the voltmodded BIOSes available for download on various websites.


To understand the internals of software voltage control, let's start from the very beginning. NVIDIA boards have some GPU-controllable GPIO (General Purpose Input/Output) pins, which are used for different purposes. Up to three of these pins can be used to control core voltage on GeForce FX based boards. The states of these pins form a binary word (up to three bits wide), which uniquely identifies the target core voltage. This word is called the VID, or voltage identifier. So to program the desired core voltage, the driver simply sets each pin to the corresponding state via the corresponding GPIO register.

But VID interpretation depends entirely on the PCB's core voltage generation logic. For example, most NV35/38 boards control core voltage via an ISL6569 IC, whose VID0 and VID1 input pins are hardwired to 0 / 1, while the VID2 - VID4 pins are programmable by the GPU. So core voltage on these boards can be adjusted in the 0.8 - 1.5V range with 0.1V granularity, and all three GPIO pins are used. Other boards may have (and do have) simpler voltage control logic (e.g. the simplest case, a 1-bit VID selecting one of 2 predefined voltages).

As I've said before, VID interpretation may differ depending on the PCB design, and the driver knows nothing about it. To allow hardware vendors to alter voltage control logic safely, NVIDIA introduced so-called voltage tables in BIOSes with BMP structure version 5.25 and newer. For older BIOSes the driver uses its own GPU-specific internal voltage table.

A voltage table begins with a header containing the total number of voltage entries, the size of each entry, and a valid-VID bitmask. The last field is the most important, because it 'tells' the driver which pins actually control the voltage. For example, nothing prevents a hardware vendor from using a 2-bit VID defined by pin 0 and pin 2; in this case the VID bitmask will contain 101b. Note that the driver will never program masked pins.

An array of voltage table entries follows the header. Each voltage table entry contains a target voltage identifier (target voltage (in Volts) * 100) and the VID defining this voltage. The first element of each entry (i.e. the target voltage identifier) exists just to allow the driver to pick the corresponding VID from the table (because the driver knows nothing about VIDs; it knows only the target voltage picked from the corresponding performance level entry in the performance table). So when programming the voltage, the driver simply picks the required voltage entry by scanning all voltage table entries, comparing the target voltage identifier with the voltage identifier of each entry and selecting the closest one. When the entry is selected, the driver disassembles the VID into separate bits and programs each non-masked bit via the corresponding GPIO register.
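The lookup-and-program sequence described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration (table contents, bitmask, and function names are invented for the example), not the actual driver code or a real BIOS voltage table:

```python
# Voltage table entries: (target voltage * 100, VID), as stored in the BIOS.
# These values are illustrative only.
VOLTAGE_TABLE = [
    (110, 0b00),  # 1.1 V -> VID 0
    (120, 0b01),  # 1.2 V -> VID 1
    (130, 0b10),  # 1.3 V -> VID 2
    (140, 0b11),  # 1.4 V -> VID 3
]
VID_BITMASK = 0b011  # only GPIO pins 0 and 1 control the voltage here


def pick_vid(target_volts, table):
    """Scan the table and select the entry whose target voltage identifier
    is closest to the requested voltage (the driver knows only the target
    voltage from the performance table, never the VID itself)."""
    ident = round(target_volts * 100)
    _, vid = min(table, key=lambda entry: abs(entry[0] - ident))
    return vid


def program_vid(vid, bitmask, set_gpio_pin):
    """Disassemble the VID into separate bits and program each non-masked
    bit via its GPIO 'register' (here: a callback standing in for it)."""
    for pin in range(bitmask.bit_length()):
        if bitmask & (1 << pin):  # masked-out pins are never touched
            set_gpio_pin(pin, (vid >> pin) & 1)


# Example: request 1.3 V and record which pin gets which state.
pins = {}
program_vid(pick_vid(1.3, VOLTAGE_TABLE), VID_BITMASK, pins.__setitem__)
print(pins)  # {0: 0, 1: 1} -> VID 2, i.e. 1.3 V on this hypothetical board
```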

If you've read all this info carefully, you may already see the logical errors and potential problems in the voltmodded BIOSes currently circulating yourself:

First, it's plain wrong to voltmod a BIOS by copying the 1.5V VID from NV38's voltage table into all other BIOSes without seeing the PCB and its voltage control logic, as is advised in BIOS voltmod tutorials. VIDs do not have to be the same on all boards.

Second, it's wrong to ignore the VID bitmask and edit only the voltage table entry's VID. As an example, let's take a board with the following 2-bit VID:

00 -> 1.1V, 01 -> 1.2V, 02 -> 1.3V and 03 -> 1.4V. An attempt to boost the voltage by increasing the VID to 4 will actually lower the voltage and result in 1.1V being set (4 & 3 = 0). An attempt to boost the voltage by copying NV38's 1.5V VID (7) will simply do nothing (7 & 3 = 3). The same attempt on a board with a different 2-bit VID interpretation (e.g. 01 -> 1.4V, 02 -> 1.3V, 03 -> 1.2V) will also lower the voltage and set it to 1.2V. So if you can actually see the PCB and are sure that there are more than 2 bits in the VID - you have to change the VID mask too. Otherwise, you simply shouldn't touch it.
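The masking arithmetic behind those two failure modes can be verified directly. The VID-to-voltage tables below are the hypothetical 2-bit interpretations from the example above, not data from any real board:

```python
# Only the bits in the valid VID bitmask ever reach the GPIO pins;
# everything else is silently dropped.
VID_BITMASK = 0b011  # 2-bit VID

# First interpretation: 0 -> 1.1V, 1 -> 1.2V, 2 -> 1.3V, 3 -> 1.4V
volts_by_vid = {0: 1.1, 1: 1.2, 2: 1.3, 3: 1.4}
# Different interpretation: 1 -> 1.4V, 2 -> 1.3V, 3 -> 1.2V
reversed_table = {0: 1.1, 1: 1.4, 2: 1.3, 3: 1.2}


def effective_voltage(requested_vid, table):
    """Voltage that is actually set after the masked bits are dropped."""
    return table[requested_vid & VID_BITMASK]


print(effective_voltage(4, volts_by_vid))    # 1.1 -- VID 4 LOWERS it (4 & 3 = 0)
print(effective_voltage(7, volts_by_vid))    # 1.4 -- NV38's VID 7 collapses to 3
print(effective_voltage(7, reversed_table))  # 1.2 -- on this board it goes DOWN
```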


To help you see whether your voltmodded BIOS really affects VIDs, RivaTuner gives you the ability to monitor the state of the voltage-related GPIO pins in realtime, so you can see which VID is currently programmed by the driver. Using RivaTuner's VID interpretation feature you can also see both the raw VID data and the target voltage corresponding to that VID (to select the VID interpretation mode, right-click the VID graph in the hardware monitoring window, select Setup from the context menu and press the More button). Furthermore, RivaTuner's diagnostic report module allows you to see the internals of the voltage table stored in the VGA BIOS, and warns you if there are entries with invalid VIDs which don't conform to the VID bitmask.
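That invalid-VID check boils down to a one-line filter: flag every voltage table entry whose VID sets bits outside the valid VID bitmask. This is a hypothetical sketch of the idea, not RivaTuner's actual diagnostic code:

```python
def invalid_entries(table, bitmask):
    """Return the (identifier, vid) pairs that use masked-out pins,
    i.e. whose VID does not conform to the valid VID bitmask."""
    return [(ident, vid) for ident, vid in table if vid & ~bitmask]


# Entry (150, 0b111) claims pin 2, but the 0b011 mask allows pins 0-1 only.
table = [(110, 0b00), (120, 0b01), (150, 0b111)]
print(invalid_entries(table, 0b011))  # [(150, 7)]
```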


Q: If it is possible to program the VID pins, will RivaTuner provide the ability to adjust GPU core voltage on-the-fly for GeForce FX display adapters?

A: No, sorry. I'll never add software voltage adjustment to RivaTuner, and I'll never provide info about the GPU registers controlling the GPIO VID pins to third-party tool creators. I don't want to be connected to the development of a tool responsible for burning someone's system, and direct voltage control via a Windows utility is one of the things that can help beginners fry their GPUs.


Razer 01.07.2006. 14:51

A simpler explanation :D

BlackDwarf 01.07.2006. 19:44

I found something interesting, so let me ask what you all think of it??

Quote:

VGA Galaxy AGP nVidia GF6600 256MB Zalman cooler

Specifications (GeForce 6600 AGP):
Graphic Core: 256-bit
Fill Rate (billion texels/sec.): 2.4
Vertices/sec. (million): 225
Graphics Bus Technology: AGP
Pixels Per Clock (peak): 8
Memory: DDR
RAMDAC: 400 MHz

755.99 kn

Techno 06.07.2006. 11:21

I have a problem. I have the card from my sig, and in 3DMark01 SE it gives me a score of 10k. Why? In 3DMark06 I get around 1400, which is OK.
Could it be that 3DMark01 gives me such a low score because the program doesn't support vertex & pixel shader 3, i.e. it runs on old technologies?

MAHER 06.07.2006. 17:31

don't worry, 3DMark01 gives me 12k, and the fps is almost constantly 200; maybe because for some reason 200 is the cap (mine never goes above it), anyway some bug, don't worry

Yaxyo 07.07.2006. 00:08

Well, in the end I bought the VF900... what can I say except that I'm thrilled :nuts: :goood:
Idle temperature dropped from 55 (stock) to 34-35 (VF900).
Mounting was easy, top marks all around :)
I just ran 3DMark and after the test finished the temperature was 36 degrees :beer:, I'll run a few more tests and report back :cool:

Razer 07.07.2006. 00:14

Quote:

Author: Yaxyo
Well, in the end I bought the VF900... what can I say except that I'm thrilled :nuts: :goood:
Idle temperature dropped from 55 (stock) to 34-35 (VF900).
Mounting was easy, top marks all around :)
I just ran 3DMark and after the test finished the temperature was 36 degrees :beer:, I'll run a few more tests and report back :cool:

Excellent choice :goood: ...and even better results (temperatures)... I see that VF900 really does cool superbly... have you tried clocking the card as far as it goes... to see what the temps are at full load then?

SRV 07.07.2006. 00:46

Maybe I'll stick that VF900 on my 7600GT too, since it's that good :)

Yaxyo 07.07.2006. 00:52

I haven't clocked anything yet because I only got home at 11 PM... I'd been out since 4 PM, first went to pick up the Zalman and then out with friends, so I carried it around all day :D and came back early to try it out :)
Expect results soon

Zvijerko 07.07.2006. 14:12

Great temperatures. I ordered a VF900 myself and until now I thought maybe I didn't need it, but now that I see how much it cooled yours down, I don't regret it. Just wait until I go pick mine up :D

Yaxyo 07.07.2006. 17:42

Here you go, I overclocked it today and it's currently running at 580/1150 (default is 500/110)... it went higher too, but I'm not even sure what artifacts look like, so I left it there :D
Idle temperature is currently 34; after three runs of 3DMark it was 37 :beer:

Techno 07.07.2006. 19:10

Imam pitanje!!
dali VF700 & VF900 odgovaraju za sve grafulje iz serije 6600?????
tj. dali bi ih mogao uglaviti na moju grafulju iz siga???

Crash009 07.07.2006. 19:26

they fit without problems, you just won't be able to put the passive heatsinks on the memory

Yaxyo 07.07.2006. 19:34

The VF900 won't fit, at least not according to Zalman. It says it doesn't go on the AGP versions.

Emissary 07.07.2006. 19:37

2001 also depends a lot on the rest of the config; mine, for example, gives around 16 thousand.

Emissary 07.07.2006. 19:39

It's impossible that your load temperature is only that much higher than idle. Turn on Hardware Monitor in RivaTuner and check there after 3DMark finishes.

Yaxyo 07.07.2006. 20:13

I'm downloading it right now... btw, I just removed the SkyStar 2 card because it's been sitting there for two or three months and I don't use it, the cable is too short. Anyway, things keep getting better: idle temperature is now 32 degrees :D
The thing is, for those not in the know, the SkyStar runs really hot

Razer 07.07.2006. 20:14

Quote:

Author: Emissary
It's impossible that your load temperature is only that much higher than idle. Turn on Hardware Monitor in RivaTuner and check there after 3DMark finishes.

I think so too... when the test finishes... while 3DMark calculates the result and you go to check the score... the card has almost cooled down already... that's why your load temperatures are so low
EDIT: @Yaxyo - I also have a SkyStar 2... and I know it runs hot, especially when downloading something with it... put a passive heatsink on the chip and you're set :D

Razer 07.07.2006. 20:47

Quote:

Author: MAHER
don't worry, 3DMark01 gives me 12k, and the fps is almost constantly 200; maybe because for some reason 200 is the cap (mine never goes above it), anyway some bug, don't worry

Are you sure? - http://forum.pcekspert.com/attachmen...1&d=1152297968 - 996 fps in the Nature test :D ...nothing is impossible... don't ask which CPU and card that was :D

d0X 07.07.2006. 21:48

could a single 7900gtx + a64 3700 manage that, or does it take something even stronger?

Zvijerko 07.07.2006. 21:54

That Nature test runs pretty slowly for me

Razer 07.07.2006. 22:03

Quote:

Author: d0X
could a single 7900gtx + a64 3700 manage that, or does it take something even stronger?

That was an Athlon FX at, I think, around 3GHz and 7800GTX SLI....

JC Denton 08.07.2006. 09:17

Quote:

Author: Techno
I have a problem. I have the card from my sig, and in 3DMark01 SE it gives me a score of 10k. Why? In 3DMark06 I get around 1400, which is OK.
Could it be that 3DMark01 gives me such a low score because the program doesn't support vertex & pixel shader 3, i.e. it runs on old technologies?

3DM01 gives you such a low score because it depends heavily on the rest of the config, and especially on the CPU. And I see your P4 Northwood 2.0GHz is clocked to 2.4GHz... and on top of that, if you factor in a somewhat old Windows installation... hm... nothing unusual about a low score. Then again, it really should give a somewhat higher score... somehow it smells like an old Windows installation to me... most likely

MAHER 08.07.2006. 11:47

yes, 900 fps, but mine never goes above 200, it won't even pull 201, and it sits at 200 for a while; I think my 3DMark01 has somehow bugged out

SetH 09.07.2006. 13:01

Albatron 6600GT
 
does this card have temperature sensors? I tried with Coolbits and nothing shows up, not even in Everest... :fuming:


All times are GMT +2.

© 1999-2024 PC Ekspert - All rights reserved ISSN 1334-2940