The number of bits used to represent a signed integer is known as its "bit width" (often matching the machine's "word size"). The bit width determines the range of values the integer can hold: in the common two's-complement representation, an n-bit signed integer can store values from −2^(n−1) through 2^(n−1) − 1. For example, an 8-bit signed integer ranges from −128 to 127.
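As a minimal sketch of this relationship, the following assumes two's-complement representation and computes the minimum and maximum values for a few common bit widths (the function name `signed_range` is illustrative, not from any standard library):

```python
def signed_range(bits: int) -> tuple[int, int]:
    """Return (min, max) for an n-bit two's-complement signed integer."""
    # The sign bit consumes one position, leaving bits-1 for magnitude;
    # the asymmetry (one extra negative value) is inherent to two's complement.
    return -(1 << (bits - 1)), (1 << (bits - 1)) - 1

for bits in (8, 16, 32, 64):
    lo, hi = signed_range(bits)
    print(f"{bits:2d}-bit: {lo} .. {hi}")
```

Running this prints, for instance, `-128 .. 127` for 8 bits and `-32768 .. 32767` for 16 bits, matching the ranges of `int8_t` and `int16_t` in C.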