Base64 vs Hex Encoding: Which Should You Use and When?

Source: DEV Community
When you need to represent binary data as printable text, two encodings dominate the landscape: base64 and hex. Understanding base64 vs hex encoding is not just academic — choosing the wrong one can bloat your payloads, break your URLs, or make cryptographic output harder to work with downstream. This guide explains exactly how each encoding works, quantifies the size trade-offs, and maps out the real-world scenarios where each one shines.

## How Hex Encoding Works

Hex encoding (also called Base16) converts every byte into exactly two hexadecimal characters from the alphabet 0–9 and a–f. Since a byte holds 8 bits and a hex digit holds 4 bits, the math is simple: one byte always becomes two characters.

```javascript
// Node.js
const buf = Buffer.from('Hi');
console.log(buf.toString('hex')); // '4869'

// Browser
function toHex(str) {
  return Array.from(new TextEncoder().encode(str))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
}
console.log(toHex('Hi')); // '4869'
```

The resulting string is 100%