Basic and Advanced Java Question:

How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java Programming?


Answer:

In Java, a char is a 16-bit value, so Unicode characters handled through char use 16 bits, while ASCII requires only 7 bits. Although the ASCII character set uses only 7 bits, it is usually stored in 8 bits (one byte). UTF-8 is a variable-width encoding that represents characters using 8, 16, 24, or 32-bit patterns (1 to 4 bytes). UTF-16 uses 16-bit code units, and characters outside the Basic Multilingual Plane are represented by a pair of 16-bit units (a surrogate pair, 32 bits in total).
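A minimal sketch illustrating these sizes (the class name EncodingSizes and the sample strings are illustrative, not from the question itself): it prints the bit width of char and the number of bytes different characters take in UTF-8 and UTF-16.

import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        // A Java char is always a 16-bit UTF-16 code unit.
        System.out.println("Bits in a char: " + Character.SIZE); // 16

        String ascii = "A";            // ASCII character (fits in 7 bits)
        String accented = "é";         // needs 2 bytes in UTF-8
        String emoji = "\uD83D\uDE00"; // supplementary character (surrogate pair)

        // UTF-8 byte lengths vary from 1 to 4 bytes per character:
        System.out.println(ascii.getBytes(StandardCharsets.UTF_8).length);    // 1 byte  (8 bits)
        System.out.println(accented.getBytes(StandardCharsets.UTF_8).length); // 2 bytes (16 bits)
        System.out.println(emoji.getBytes(StandardCharsets.UTF_8).length);    // 4 bytes (32 bits)

        // UTF-16 uses 16-bit units; supplementary characters need two of them:
        System.out.println(ascii.getBytes(StandardCharsets.UTF_16BE).length); // 2 bytes (16 bits)
        System.out.println(emoji.getBytes(StandardCharsets.UTF_16BE).length); // 4 bytes (surrogate pair)
    }
}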
