The clock rate typically refers to the frequency at which a chip such as a central processing unit (CPU), or one core of a multi-core processor, is running, and is used as an indicator of the processor's speed. It is measured in the SI unit hertz. The clock rates of first-generation computers were measured in hertz or kilohertz, but in the 21st century the speeds of modern CPUs are commonly advertised in gigahertz. This metric is most useful when comparing processors within the same family, holding constant other features that may impact performance. Video card and CPU manufacturers commonly select the highest-performing units from a manufacturing batch and rate them for a higher maximum clock rate, selling them at a premium price.
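Since clock rate is a frequency, the duration of a single clock cycle is simply its reciprocal. The following illustrative sketch (the function name and the example rates are chosen for demonstration only) converts an advertised clock rate in hertz into the length of one cycle in nanoseconds:

```python
def cycle_time_ns(clock_rate_hz: float) -> float:
    """Return the duration of one clock cycle in nanoseconds.

    Clock rate (hertz) and cycle time are reciprocals: a processor
    running at f Hz completes one cycle every 1/f seconds.
    """
    return 1e9 / clock_rate_hz

# A 1 MHz early-microprocessor-era chip vs. a hypothetical 3.5 GHz modern core:
print(cycle_time_ns(1e6))    # 1000.0 ns per cycle
print(cycle_time_ns(3.5e9))  # ~0.286 ns per cycle
```

This makes concrete why a gigahertz-class CPU completes thousands of cycles in the time a kilohertz- or megahertz-class machine completes one, though actual performance also depends on how much work each cycle accomplishes.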