No no, the mess exists with anyone/anything that says kilobytes when they mean kibibytes. This isn't a popularity contest; there is a clear and logical right and wrong.
I'll end by saying that the software this forum was created for disagrees with you.
EDIT: What if hard drive manufacturers weren't the ones who pushed to fix the industry? What if the existing base-2 standard had been written down without any of their input? Try to see past your apparent hatred of hard drive manufacturers and reevaluate what you're pushing for.
The mess began when computer scientists decided to make a kilobyte 1024 bytes instead of 1000. I wish they had made a KB = 1000, but they didn't. We can argue about their thoughts or feelings, but it's pointless. A kilobyte was, and should remain, 1024 bytes. If you don't like it, make up your own word and call it 1000. Does it suck? Heck yes. Is it the right thing to do for backwards compatibility forever? ABSOLUTELY. Twenty years from now I don't want some computer science grads debating whether a kilobyte meant 1000 or 1024 bytes for a given device, getting it wrong, and causing an error that results in a loss of life.
The word kilobyte was created to mean 1024 bytes, and so on up the scale. How dare the hard drive industry try to change the rules just because it was inconvenient for them. Honestly, it shows that the hard drive industry really wasn't well informed about the IT field when it started manufacturing hard drives. A family member who worked in IT in the 60s and 70s says he has a hard drive that is 2MB. That is a true 2MB drive. A few companies started using a modified definition of MB back in the day because they thought they could get away with it. They did, for a long time. Fast forward to about 10 years ago, and people started noticing their hard drives weren't as big as they thought. Then about 5 years ago those companies lost a lawsuit and the naming got changed, because the old definition was inconvenient for them.
Changing the definition of a kilobyte from 1024 to 1000 is a horrible idea. It's broken, it can't be fixed. The cat is out of the bag. Give the unit of 1000 a new name; don't try to change things later. What do you think would happen if I suddenly pushed through a change to the Fahrenheit standard that moved the freezing point of water to the equivalent of -2°C and the boiling point to +104°C? Everything would go to shit all over the place. Your body is no longer 98.6°F. You have an oral thermometer in your medicine cabinet? Is it the "old Fahrenheit" or the "new Fahrenheit"? If a doctor trying to diagnose your medical condition sees that your body temperature was 99.4°F years ago, was that the old system or the new? Did you actually run a fever? How about that cooking thermometer, the refrigerator you own, the water heater, the thermostat in your car engine, etc.? YOU CANNOT do a standards change without seriously screwing up a lot of stuff. PERIOD. It's not about right and wrong. It's about not doing something stupid. What if we suddenly changed the definition of a "kilowatt" because the electric companies wanted it to be 1024 watts instead of 1000?
Regardless, I think I've made my point. A kilobyte has always been 1024 bytes where computing is concerned. The fact that some geeks screwed the pooch fifty years ago doesn't mean we should suddenly change 1024 to 1000 now. It's too late. The cat is out of the bag, has been for 50 years, and died out of the bag. We're stuck with it.
Real-world example of this BS going wrong: RTDs (resistance temperature detectors) have had many standards come and go over the years. The resistance of a metal element changes predictably as its temperature changes, so by reading that resistance you can calculate the temperature at the RTD's location.

Back in the 70s we bought our RTDs from a company in Germany. They had a standard (one of about 15, and none of them cross-compatible) that was unique to their company. Every company wanted its own standard (proprietary for the win!). Of course, the German-made RTD had some real advantages, and when the industry decided it needed to establish a standard, every single company wanted THEIR RTDs to be "the" standard. Eventually things narrowed down to just a couple of potential candidates. The German name for their standard was kept, but the slope of the RTD curve was changed.

Since the slope was changed, you had a problem. Depending on what year it was manufactured (NOT the year you bought it!), what company made it (some are rebranded from other companies, or from MULTIPLE other companies!), PLUS what contract you purchased it under (you always want the RTDs to be identical for a given application, right? and hopefully the company you just overpaid did its homework and made sure your RTDs actually matched each other... which they didn't all do), your RTD may be one of two possible choices. If your RTD was manufactured under the "pre-standard" standard, you had one resistance-versus-temperature chart. If it was "post-standard", you had a different chart.

How do you know which chart you have? You pull the RTD out, expose it to a known temperature, and see which of the two curves the reading lands on. But some RTDs were designed and installed so that they can't be removed. So do you roll the dice and guess? And what do you do when the faded printout from the 70s doesn't match the clean chart you just printed from the internet!? If you're like me, those RTDs were made and installed before I was born.
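To make the two-chart problem concrete, here's a rough Python sketch. The slope values are the commonly cited nominal alphas for Pt100 elements (0.00385 for the DIN/IEC curve, 0.003916 for the older US curve), and I'm using a simple linearized model; real conversions use the full Callendar-Van Dusen polynomial, so treat this as an illustration of the ambiguity, not a calibration tool:

[code]
# Illustrative sketch: the same resistance reading from a Pt100 RTD gives
# two different temperatures depending on which curve it was built to.
# Linearized model: R(T) = R0 * (1 + alpha * T).

R0 = 100.0             # ohms at 0 degrees C for a Pt100 element
ALPHA_DIN = 0.003850   # "European" curve (DIN/IEC), nominal slope
ALPHA_US = 0.003916    # older "US" curve, slightly steeper slope

def temperature_c(resistance_ohms, alpha):
    # Solve R = R0 * (1 + alpha * T) for T
    return (resistance_ohms / R0 - 1.0) / alpha

measured = 138.5  # ohms, a made-up reading from an RTD of unknown vintage

t_din = temperature_c(measured, ALPHA_DIN)
t_us = temperature_c(measured, ALPHA_US)
print(f"DIN curve: {t_din:.1f} C")                 # ~100.0 C
print(f"US curve:  {t_us:.1f} C")                  # ~98.3 C
print(f"Disagreement: {abs(t_din - t_us):.1f} C")  # ~1.7 C
[/code]

A degree and a half of disagreement is nothing for a kitchen oven and everything for what comes next.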
Guess what this RTD is used for?
Monitoring temperatures for a freaking nuclear reactor! Yes.
So WTF should we do now that the definition has shifted because the industry shifted? Guess what we do: we spend HUGE sums of money replacing RTDs that were never intended to be replaced, and people get unnecessary radiation exposure, all because the industry was a bunch of asshats.
After seeing how badly engineers can completely screw shit up, I've become a VERY FIRM believer that you shouldn't change the definition of something just because it's inconvenient. Broken or not, for better or for worse, it is what it already is.
So what do I call someone that knows that a KB is 1024 bytes and works in the IT industry? Experienced!
What do I call someone that says a KB is 1000 bytes and still works in the IT industry? Someone I'm scared to work with. (No offense to anyone in this thread... I knew this topic would be a major problem when I first read about the push to change the names, back when the "bi" prefix was floated as a potential solution more than 10 years ago.) I heard the "bi" started as a joke in an IRC channel during discussion of this "issue" in the 90s.
What do I call an engineer that says a kiloton is 1000 tons? Experienced!
What do I call an engineer that says a kiloton is 1024 tons? Someone that should be unemployed.
But anyway, hard drive manufacturers can't have their naming both ways. They don't sell you a 3TB hard drive with 4-kibibyte sectors, do they? NOPE. They sell you a 3TB hard drive with 4KB sectors. It's clearly printed on the box! Well, get your asses in gear and figure out what the fuck is TB/MB/KB and what is TiB/MiB/KiB. They'd surely sell you a hard drive with their version of 4000-byte sectors if they could. Too bad most file systems couldn't cope, because 1KB has been 1024 bytes for... ever. If the damn jerks that created this monstrous mess can't even get THEIR paperwork right on their own products, what makes me think I should ever trust them?
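To spell out the sector half of that double standard, a quick sketch (4096 bytes being the usual "Advanced Format" sector size; the numbers are just for illustration):

[code]
# The "4KB" sector printed on the box is binary: 4096 bytes, i.e. 4 KiB.
# A true decimal "4 kB" sector would be 4000 bytes, which no drive uses.
sector_bytes = 4096
print(sector_bytes / 1024)  # 4.0   -> a round 4 KiB, matching the label
print(sector_bytes / 1000)  # 4.096 -> not a round decimal kB at all
[/code]

So on the very same box, "KB" is binary for the sectors and "TB" is decimal for the capacity.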
Hard drive manufacturers failed at life when they chose to redefine TB/GB/MB. The crappiest part of my life as a direct result of this mess, so far, is that I have to deal with postings every few days saying "my zpool is missing space". I'm sure over the next 30+ years it'll get quite old as more and more people realize that a TB isn't a TB, thanks to a 'reimagined' definition of the TB between 1995 and 2007-ish. I'm sure Apple chose to make a KB 1000 only because they were tired of the lemmings calling the Apple support hotline to ask why their cool new Mac doesn't have as much hard drive space as they were sold.
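For anyone who landed here from one of those "missing space" threads, here's the arithmetic behind it. This is a sketch assuming a drive labeled in decimal TB and tools that report binary units, which is how zpool, df, and most OS tools behave:

[code]
# Why a "3TB" drive shows up as ~2.7T in your tools: the label means
# 3 * 10**12 bytes (decimal TB), but the tools divide by 2**40 (binary TiB).
advertised_bytes = 3 * 10**12            # what the manufacturer sold you
reported_tib = advertised_bytes / 2**40  # what the OS reports
print(f"3 TB on the box = {reported_tib:.2f} TiB on the screen")
# 3 TB on the box = 2.73 TiB on the screen -- nothing is actually missing
[/code]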
While I may technically be wrong because I still say 1GB = 1024MB today, the hard drive makers created the mess when they created the new standard. Before their standard, it was accepted practice (perhaps even a de facto standard) that a kilobyte was 1024 bytes.
I should also add that kilo, mega, and tera are prefixes from the metric measuring system. There is no "metric" or "imperial" measuring system for bytes or bits. Those prefixes had ZERO direct bearing on anything outside the metric system until some schmucks in the hard drive industry wanted to change that and make it formal.
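For the record, here's how far apart the two readings of each prefix drift, which is exactly why nobody noticed the difference at the KB scale and everybody notices at the TB scale (a quick sketch):

[code]
# Gap between the binary and decimal meanings of each prefix:
# 2**(10*k) vs 10**(3*k) for K through T.
for k, prefix in enumerate(["K", "M", "G", "T"], start=1):
    binary = 2 ** (10 * k)
    decimal = 10 ** (3 * k)
    gap = (binary / decimal - 1) * 100
    print(f"{prefix}: binary is {gap:.1f}% bigger than decimal")
# K: 2.4%, M: 4.9%, G: 7.4%, T: 10.0%
[/code]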