Endian

From AMule Project FAQ
Latest revision as of 21:45, 9 March 2005

== Definition ==

Endianness defines the order in which the bytes of a multi-byte value are stored in physical memory.

== Into the issue ==

There are two standards:

*[[Little endian]]
*[[Big endian]]

Both are widely used and common.

== Summary ==

[[Little endian]] requires the bytes with the lowest weight (the least significant bytes) to be placed first in memory.

[[Big endian]] requires the bytes with the highest weight (the most significant bytes) to be placed first in memory.

Other byte orders exist (such as middle-endian), but they are mostly very old, outdated, and non-standard.

== Notes ==

Endianness used to be a concern only when storing numbers, but since [http://www.unicode.org Unicode]'s rise in popularity the problem now affects text too, because in Unicode a single character may require more than one byte to represent.