A multimeter is used to measure voltage more often than current because once you know the voltage across a circuit element and its resistance, you can calculate the current flowing through it using Ohm's Law: Voltage = Resistance × Current (V = RI).
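The Ohm's Law calculation above can be sketched in a few lines of Python. The voltage and resistance values here are hypothetical, chosen only to illustrate the arithmetic:

```python
# Ohm's law: V = R * I, so the current is I = V / R.
# Hypothetical example: 5.0 V measured across a 250-ohm element.
voltage = 5.0        # volts, read from the multimeter
resistance = 250.0   # ohms, known resistance of the element
current = voltage / resistance  # amperes
print(current)  # 0.02 A, i.e. 20 mA
```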
To measure electric current you need an ammeter; to measure resistance, an ohmmeter; and to measure voltage, a voltmeter. A multimeter combines all three functions in one device. It has a dial on the front so you can select the function and sensitivity you need, and an LCD screen that displays the readout. To measure the voltage across a circuit element, connect the meter's leads in parallel with the element. In the voltage setting the meter has a very high resistance, so very little current runs through it.
Plug the black lead wire that comes with the multimeter into the socket on the front of the meter marked COM, which stands for "common." Plug the red lead wire into the socket marked VΩmA, usually the middle one. The top socket, marked 10A, is rarely used.
Turn the dial to the section where the voltage settings are located; it is usually labeled V=. There is a selection of sensitivities. Choose the least sensitive millivolt (mV) setting, which is the one that can measure the highest number of millivolts.
Touch the red lead to one terminal of the circuit element whose voltage you want to measure and the black lead to the other. You can also touch the leads to the wires connected to those terminals.
If you don't get a reading, turn the dial to the next more sensitive setting, and keep stepping up in sensitivity until you do. The reading, when you get one, will be in millivolts. No further calibration is required.
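The manual range-stepping procedure above can be illustrated with a short sketch. This is not meter firmware, just a model of the rule "start at the least sensitive range and step up until the signal still fits"; the range values are hypothetical:

```python
# Hypothetical full-scale values in millivolts, least sensitive first.
RANGES_MV = [2000.0, 200.0, 20.0]

def best_range(signal_mv):
    """Return the most sensitive range that can still display the signal.

    Mirrors the manual procedure: start with the least sensitive
    setting and step up in sensitivity until the next range would
    overload (i.e., give no usable reading)."""
    chosen = RANGES_MV[0]
    for full_scale in RANGES_MV:
        if signal_mv <= full_scale:
            chosen = full_scale  # this range can display the value
        else:
            break  # any more sensitive range would overload
    return chosen

print(best_range(150.0))   # 200.0 -> read on the 200 mV range
print(best_range(1500.0))  # 2000.0 -> only the least sensitive range fits
```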
To use an autoranging multimeter, simply turn the dial to volts (V) and touch the leads to the terminals. The meter selects its own range automatically and gives you a reading.