
View Full Version : When did we start referring to Megabytes as "Mebibytes"?



blueturtl
February 26th, 2010, 04:46 PM
When I was taught the bit magicks, they used to refer to bits and bytes like this:

A lowercase b signifies a bit, a capital B signifies a byte, and thus:

MB/mB = megabyte (or 1024 kB or 1,048,576 B)
Mb/mb = megabit (or 1000 kb or 1,000,000 b)

Lots of programs, including many BitTorrent clients and, for example, the Ubuntu disk utility, now refer to MB as MiB, or "Mebibyte" (according to Wikipedia).

Why this sudden change in notation? Looking out for the misuse of MB or GB was bad enough; now they've got a third way to mark these units.

I realize a lot of people probably felt they were (or are) getting ripped off because service providers and memory manufacturers often misused the old MB (they used bit-logic instead of byte-logic). For example, my new terabyte hard drive is really a 1,000,000,000,000-byte hard drive (which in the real world is roughly 931 gigabytes). That's 70 gigs of bogus capacity right there.
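
To put numbers on it, here's a quick Python sketch of that arithmetic (my own illustration, nothing official):

# A "1 TB" drive as marketed is 10**12 bytes; the OS reports the size
# in binary gigabytes (GiB), so the number on screen looks smaller.
marketed_bytes = 10**12                  # 1 TB as printed on the box (SI prefix)
GiB = 2**30                              # 1,073,741,824 bytes

reported = marketed_bytes / GiB          # what the OS shows, labelled "GB"
print(f"{reported:.2f}")                 # 931.32
print(f"{1000 - reported:.2f}")          # 68.68 -- the "missing" ~70 gigs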

So instead of making them use the correct notation, or ship products that match the product description, we just invented the "mebibyte"?

What do you guys think?

matchett808
February 26th, 2010, 04:49 PM
The mebibyte is a multiple of the unit byte (http://en.wikipedia.org/wiki/Byte) for quantities of digital information (http://en.wikipedia.org/wiki/Information). The binary prefix (http://en.wikipedia.org/wiki/Binary_prefix) mebi (http://en.wikipedia.org/wiki/Mebi-) means 2^20, therefore 1 mebibyte is 1,048,576 bytes. The unit symbol for the mebibyte is MiB.[1] (http://en.wikipedia.org/wiki/Mebibyte#cite_note-0) The unit was established by the International Electrotechnical Commission (http://en.wikipedia.org/wiki/International_Electrotechnical_Commission) (IEC) in 2000 and has been accepted for use by all major standards organizations. It was designed to replace the megabyte (http://en.wikipedia.org/wiki/Megabyte) used in some computer science contexts to mean 2^20 bytes, which conflicts with the SI definition of the prefix mega (http://en.wikipedia.org/wiki/Mega-).


found on wikipedia..........first paragraph on the page....
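
If it helps, here are the two prefix families side by side in a few lines of Python (my own illustration, not from the article):

# SI (decimal) vs IEC (binary) prefixes for the byte, side by side.
pairs = [("kB", "KiB"), ("MB", "MiB"), ("GB", "GiB"), ("TB", "TiB")]
for i, (si, iec) in enumerate(pairs, 1):
    print(f"1 {si} = {1000**i:>17,} bytes    1 {iec} = {1024**i:>17,} bytes")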

Cabs21
February 26th, 2010, 04:53 PM
The mebibyte is a multiple of the unit byte (http://en.wikipedia.org/wiki/Byte) for quantities of digital information (http://en.wikipedia.org/wiki/Information). The binary prefix (http://en.wikipedia.org/wiki/Binary_prefix) mebi (http://en.wikipedia.org/wiki/Mebi-) means 2^20, therefore 1 mebibyte is 1,048,576 bytes. The unit symbol for the mebibyte is MiB.[1] (http://en.wikipedia.org/wiki/Mebibyte#cite_note-0) The unit was established by the International Electrotechnical Commission (http://en.wikipedia.org/wiki/International_Electrotechnical_Commission) (IEC) in 2000 and has been accepted for use by all major standards organizations. It was designed to replace the megabyte (http://en.wikipedia.org/wiki/Megabyte) used in some computer science contexts to mean 2^20 bytes, which conflicts with the SI definition of the prefix mega (http://en.wikipedia.org/wiki/Mega-).
found on wikipedia..........first paragraph on the page....


+1... nah, +10! Very nice answer and very informative. I would have just put a link to Wikipedia and said "read this". Way to go above and beyond. =D>

Maheriano
February 26th, 2010, 05:02 PM
Yeah, mebibytes are denoted by MiB, so there's no confusion. I never knew about it either until I met with a client at work and they asked whether the software we write was going to display file sizes in MB or MiB. I was thinking, "WTF are you talking about?"

Basically:
megabyte = 1000 bytes
mebibyte = 1024 bytes

So everyone should now use mebibytes when talking about binary values (2^x) and megabytes when simply referring to chunks based on powers of 1000.

rottentree
February 26th, 2010, 05:11 PM
Also:
http://en.wikipedia.org/wiki/Megabit



So everyone should now use mebibytes when talking about binary values (2^x) and megabytes when simply referring to chunks based on powers of 1000.

I've never heard people say "mebibyte" except for teachers explaining the difference.
Most people say "megabyte" when they're talking about mebibytes, even though they know the difference.

blueturtl
February 26th, 2010, 05:55 PM
Pluto is not a planet any more and a megabyte is now the same as the old megabit. (edit: as in a round number).
They have no right to go changing this stuff. :D

I will have to update all my teaching materials.

ratcheer
February 26th, 2010, 06:48 PM
Pluto is not a planet any more and a megabyte is now the same as the old megabit.

They have no right to go changing this stuff. :D

I will have to update all my teaching materials.

I agree with you, blue turtle. If the terms had not been misused in the first place, they wouldn't have had to invent a stupid new term.

Tim

koleoptero
February 26th, 2010, 07:09 PM
There's so much misinformation in this thread I couldn't resist.

Do you even know the difference between a bit and a byte?
You have no idea where the lost gigabytes in the hard drives go, do you?

My goodness...

V for Vincent
February 26th, 2010, 07:13 PM
For example my new terabyte hard drive is really a terabit hard drive

Now *that* would be a total rip: a terabit would be one eighth of a terabyte (if terabyte is taken to mean 10^12 bytes and not 2^40). You're right that it is confusing, though. I just use the traditional prefixes and mention what I take them to mean if it makes a significant difference.
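
The eight-to-one gap in plain Python, for anyone who wants to see it (assuming the decimal reading of "tera"):

# Terabit vs terabyte, taking "tera" as the decimal 10**12 in both.
terabit_in_bytes = 10**12 // 8      # 125,000,000,000 bytes
terabyte_in_bytes = 10**12          # 1,000,000,000,000 bytes
print(terabyte_in_bytes // terabit_in_bytes)   # 8 -- one eighth, quite the rip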

blur xc
February 26th, 2010, 07:14 PM
I agree with you, blue turtle. If the terms had not been misused in the first place, they wouldn't have had to invent a stupid new term.

Tim

This is an interesting precedent... So- what's the next term we should all misuse to the point they invent another dumb word that means the same thing as the misused term used to mean before it was misused to the point of needing a new word to describe it?

:p

I think that makes sense...

BM

koleoptero
February 26th, 2010, 07:36 PM
Now *that* would be a total rip: a terabit would be one eighth of a terabyte (if terabyte is taken to mean 10^12 bytes and not 2^40). You're right that it is confusing, though. I just use the traditional prefixes and mention what I take them to mean if it makes a significant difference.

Thank god someone knows what a bit and a byte are.

blueturtl
February 26th, 2010, 07:38 PM
There's so much misinformation in this thread I couldn't resist.

Do you even know the difference between a bit and a byte?
You have no idea where the lost gigabytes in the hard drives go, do you?

My goodness...


Now *that* would be a total rip: a terabit would be one eighth of a terabyte (if terabyte is taken to mean 10^12 bytes and not 2^40). You're right that it is confusing, though. I just use the traditional prefixes and mention what I take them to mean if it makes a significant difference.

I've got to clear my name now (as you might have noticed I did some editing on my initial post while you were posting your replies).

I do in fact know the difference between a bit and a byte. A byte is 8 bits. What I meant (don't post when tired, kids) was that they round up the numbers the same way they would megabits or kilobits, but that they sell gigabytes or megabytes.

I'm under the impression a mebibyte is what megabyte used to mean, or has the megabyte always been a nice round million bytes?

Why use so many different ways to measure files and capacities?! It's driving me crazy, especially when they use abbreviations like M for mega: which mega are they referring to now, a megabit, a megabyte or a mebibyte?

ratcheer
February 26th, 2010, 08:48 PM
This is an interesting precedent... So- what's the next term we should all misuse to the point they invent another dumb word that means the same thing as the misused term used to mean before it was misused to the point of needing a new word to describe it?

:p

I think that makes sense...

BM

Yes, perfect sense.

Tim

hobo14
February 27th, 2010, 01:14 AM
I agree with you, blue turtle. If the terms had not been misused in the first place, they wouldn't have had to invent a stupid new term.

Tim

I agree too. "Mebibyte" is an absurd term that I don't plan on using.
Even though "megabyte" (etc.) uses the SI prefix "mega", "megabyte" was never intended to precisely conform to SI standards.

This situation only arose because hard disk manufacturers gave "megabyte" a second, truly SI meaning, to widen their margins.

GeoMX
August 10th, 2010, 04:06 PM
This situation only arose because hard disk manufacturers gave "megabyte" a second, truly SI meaning, to widen their margins.
Publicity is everywhere.

Anyway, I think it is nice that now you can tell for sure what the capacity of an HD is ;)

EoByte
October 1st, 2010, 02:38 PM
I really like the idea of Mebibytes. Yes, it sounds weird right now, but it's always the same and can always be interpreted as 2^20. The way vendors manipulate the meaning of the word Megabyte has always bothered me. Mebibyte is consistent and can't be interpreted to mean anything else. I, for one, intend to use it.

Also, I noticed that Google is using it in its Android development SDK. That should carry some clout. Only time will tell if it will stick, but the more we use it, the more likely it is to stick.

Eo

forrestcupp
October 1st, 2010, 05:22 PM
No one selling a product will ever use the term MiB because they want to fool you into thinking you are getting more than you actually are.

endotherm
October 1st, 2010, 05:25 PM
IIRC the term was coined after lawsuits against drive manufacturers over the discrepancy were determined to be infeasible (since technically, per the literal definition, they are correct, albeit somewhat misleading to the consumer).

This is what we get for trying to shoehorn a base-2 set of concepts into a metric system that is conceptually founded on base-10.

ltpriest
October 4th, 2010, 11:03 PM
Is this ever confusing!!

Just some stuff I learned back in the stone age...

10^0 = 1 Unit – unity
10^1 = 10 – Deca Unit
10^2 = 100 – Hecto Unit
10^3 = 1,000 – Kilo Unit

10^6 = 1,000,000 – Mega Unit

10^9 = 1,000,000,000 – Giga Unit
etc.

However, in computer geek speak, binary (base 2) rules:
1 – bit
4 – nibble
8 – Byte
16 – word
32 – double word
etc.

further
1024 bit == 1 kbit or 1 kb
1,048,576 bit == 1 mbit or 1 mb
etc.

just a byte further...

as noted above:
8 bit == 1 Byte
1024 Byte == 1 kiloByte or 1 kB
1,048,576 Byte == 1 megaByte or 1 mB

The reason the letters b, k, m, g, t or B, K, M, G, T are used is that they denote different things.

Hence, if some entity says they have a 1 MB drive, it will have 10^6 Bytes or 1,000,000 Bytes, i.e. one MegaByte or 1 MB,
not the popular misconception of 1,048,576 Bytes == 1 megaByte or 1 mB.

Now losing 48,576 Bytes for every MB may not seem like much, but there are 1,073,741,824 Bytes in a gB, so you lose/gain 73,741,824 Bytes if you mix up gB with GB.

The tera/Tera is even better.
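
Here is the same arithmetic in a few lines of Python, in case anyone wants to check the tera figure too (my own sketch):

# Difference between the decimal and binary readings of each prefix.
for n, name in enumerate(["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"], 1):
    print(f"{name}: {1024**n - 1000**n:>16,} Bytes difference")

It prints 24, then 48,576, then 73,741,824, then 99,511,627,776 Bytes, matching the figures above.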


Thanks,
P.S. I found this: http://www.iec.ch/zone/si/si_bytes.htm
so I won't quibble about the mibbles anymore...

NovaAesa
October 4th, 2010, 11:51 PM
I do in fact know the difference between bit and byte. A byte is 8 bits.

I'm not sure where you are getting your sources, but this is incorrect. The number of bits in a byte depends on the computer architecture; it's the number of bits required to encode one character (and is sometimes, but not always, the size of the smallest addressable unit of memory). While today the majority of architectures have an 8-bit byte, this wasn't always the case: many earlier architectures from the 60s had a 4-, 5-, or 6-bit byte.

t0p
October 5th, 2010, 12:06 AM
Ubuntu's "System Monitor also uses these "mebibyte" things. Take a look at "Resources" and you'll see that "Memory and Swap History" and "Network History" are both reported in MiB units. Also "Processes" and "File Systems".

But when are we, aka "the users", going to start using mebibytes etc.? It seems to me that folk still talk in terms of "megs" and "gigs". The Fujitsu external hard drive I bought last year has "400 GB" on the box. So what kind of standardisation is this?

endotherm
October 5th, 2010, 12:18 AM
Ubuntu's "System Monitor also uses these "mebibyte" things. Take a look at "Resources" and you'll see that "Memory and Swap History" and "Network History" are both reported in MiB units. Also "Processes" and "File Systems".

But when are we, aka "the users", going to start using mebibytes etc.? It seems to me that folk still talk in terms of "megs" and "gigs". The Fujitsu external hard drive I bought last year has "400 GB" on the box. So what kind of standardisation is this?
On the user end, the distinction is far less meaningful. It's only when comparing data size to marketed disk capacity that it gets meaning, and at present storage volume sizes the difference can be considered negligible (well, a 69 GB loss on a TB drive is a bit irking...). Once we get to peta/exa/zettabytes, however, the market (or should I say, the marketing) will have to correct itself, as the two measurements won't even be similar.

In general, the only time megs/gigs are a lie is when they are printed on the HDD's box.
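
A back-of-envelope Python sketch of how fast the two readings drift apart (my own numbers, not anyone's spec):

# Each prefix step multiplies the binary/decimal ratio by another 1024/1000.
for n, p in enumerate(["kilo", "mega", "giga", "tera", "peta", "exa", "zetta"], 1):
    print(f"{p}: binary reading is {((1024 / 1000) ** n - 1) * 100:.1f}% larger")

By zetta the gap is about 18%, so the two really won't be even similar.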

3rdalbum
October 5th, 2010, 01:46 AM
I want to know why data transfer speeds are measured in bits per second, and sizes are measured in bytes.

After all, it's not like a car's speedometer goes in metres per second and the odometer goes in miles.

endotherm
October 5th, 2010, 03:04 AM
I want to know why data transfer speeds are measured in bits per second, and sizes are measured in bytes.

After all, it's not like a car's speedometer goes in metres per second and the odometer goes in miles.
I think they defined that in the old days (1980, IEEE committee 802) when 10 kbit sounded better than 1.2 kByte.

Kidding aside, in networking you are worried about throughput, so you measure what matters to you: the transmission of each bit in a stream. Since platforms determine what a "byte" is (as another poster mentioned, there have been plenty of platforms that used bytes of fewer than 8 bits), and networks are platform neutral, it makes sense to break things down to the smallest universal unit.

With programming, you are trying to give binary data meaning, so you worry about things like character encoding formats. Networks don't care about encoding, but apps sure do. The ASCII character set represents each character as a number between 0 and 127 (extended variants use 0 to 255), so 8 bits is great for an ASCII system. 8 bits can also be expressed in 3 octal digits or 2 hexadecimal digits, so it's a good choice from a mathematical expression standpoint as well.
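
To tie the two worlds together, a rough Python sketch of that marketing arithmetic (the function name and figures are mine, purely for illustration):

# A marketed line speed (decimal megabits/s) vs the transfer rate a
# download dialog typically shows (binary mebibytes/s).
def mbit_s_to_mib_s(mbit: float) -> float:
    bytes_per_second = mbit * 1_000_000 / 8   # network prefixes are decimal
    return bytes_per_second / 2**20           # report in MiB/s

print(f"{mbit_s_to_mib_s(10):.2f}")   # a "10 Mbit" line moves ~1.19 MiB/s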

Dustin2128
October 5th, 2010, 05:49 AM
Pluto is not a planet any more and a megabyte is now the same as the old megabit. (edit: as in a round number).
They have no right to go changing this stuff. :D

I will have to update all my teaching materials.
That's not all, they've changed plenty of medical terminology too. I think I see a textbook industry plan in the making...

v1ad
October 5th, 2010, 05:56 AM
It actually was brought into play by the ISPs: they realized they can sell internet access for more when they say 10 megabits instead of 1.2 megabytes.

lisati
October 5th, 2010, 05:57 AM
Basically:
megabyte = 1000 bytes
mebibyte = 1024 bytes

Huh? I've always used kilo.... for 1000/1024 :D

It's only recently that I've become aware of the "mebi" prefix. The convention used by the programmers I hung out with when working in the field (back in the 1980s) was binary-based for computer stuff, even though we "knew" it was wrong, and the "proper" decimal-based for everything else.

MasterNetra
October 5th, 2010, 06:26 AM
Yeah, mebibytes are denoted by MiB, so there's no confusion. I never knew about it either until I met with a client at work and they asked whether the software we write was going to display file sizes in MB or MiB. I was thinking, "WTF are you talking about?"

Basically:
megabyte = 1000 bytes
mebibyte = 1024 bytes

So everyone should now use mebibytes when talking about binary values (2^x) and megabytes when simply referring to chunks based on powers of 1000.

Actually, a megabyte is 1000 kB (a kilobyte being 1000 bytes). This whole MiB thing is weird.

lisati
October 5th, 2010, 08:22 AM
Thread title edited :D

blueturtl
October 5th, 2010, 08:27 AM
I'm not sure where you are getting your sources, but this is incorrect. The number of bits in a byte depends on the computer architecture; it's the number of bits required to encode one character (and is sometimes, but not always, the size of the smallest addressable unit of memory). While today the majority of architectures have an 8-bit byte, this wasn't always the case: many earlier architectures from the 60s had a 4-, 5-, or 6-bit byte.

Thank you for pointing that out. In the context of this thread one could assume I was talking about the x86 architecture, though. ;) This thread is turning out to be so embarrassing for me. All I meant to do was point out that there is a discrepancy between the measuring units used in marketing and the actual measurements in computer systems, and that instead of one or the other adjusting, a new term was invented.

Instead I end up cleaning up after a very poorly worded first post. :oops:


Thread title edited :D

Thank you lisati, you are kind. :)

koleoptero
October 5th, 2010, 11:48 AM
I'm not sure where you are getting your sources, but this is incorrect. The number of bits in a byte depends on the computer architecture; it's the number of bits required to encode one character (and is sometimes, but not always, the size of the smallest addressable unit of memory). While today the majority of architectures have an 8-bit byte, this wasn't always the case: many earlier architectures from the 60s had a 4-, 5-, or 6-bit byte.

A byte is still 8 bits, we just have multibyte characters ;)

Hyporeal
October 5th, 2010, 06:53 PM
Ubuntu's "System Monitor also uses these "mebibyte" things. Take a look at "Resources" and you'll see that "Memory and Swap History" and "Network History" are both reported in MiB units. Also "Processes" and "File Systems".

But when are we, aka "the users", going to start using mebibytes etc.? It seems to me that folk still talk in terms of "megs" and "gigs". The Fujitsu external hard drive I bought last year has "400 GB" on the box. So what kind of standardisation is this?

I would argue that users should use MiB and GiB rarely (if ever). The notation represents facts about computer architecture that are not relevant to typical users. However, using GiB and GB correctly is absolutely better than the old practice of ambiguously using GB to refer to both GiB and GB. So as long as the Ubuntu System Monitor and your hard drive use their terms correctly, we're already far better off than we were before.
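
For instance, a hypothetical display routine could simply print both readings, so the label can never mislead (just a sketch, not what System Monitor actually does):

# Print a size in both decimal (GB) and binary (GiB) units.
def both_units(size_bytes: int) -> str:
    return f"{size_bytes / 10**9:.2f} GB ({size_bytes / 2**30:.2f} GiB)"

print(both_units(400_000_000_000))   # that "400 GB" box: 400.00 GB (372.53 GiB)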

endotherm
October 5th, 2010, 07:14 PM
A byte is still 8 bits, we just have multibyte characters ;)
Nowadays, yes, we have 2/4-byte Unicode characters, but in the old days many purists believed that ASCII was bloated and EBCDIC was too Extended, so some systems used 4- or 6-bit bytes.
https://secure.wikimedia.org/wikipedia/en/wiki/Byte
see first paragraph.