There seems to be some kind of a "flap" concerning the implications of Apple "working with the government on security". I think it is pointless to worry. Remember that OS X is based on an open-source core, Darwin, whose source code is free for anyone to download and read (or at least for those with the technical expertise and inclination to do so). Any tinkering with cryptographic key-escrow mechanisms, covert back doors, etc. would be identified very rapidly. And if you are sufficiently paranoid to think that Apple might load kernels onto your machine that aren't strictly derived from the open-source code in the Darwin project - i.e. if you fear the code has been deliberately trojanised after the event - you can always go ahead and compile your own kernel from source. Help from techie friends is strongly advised.
I'd also like to point out that one of the largest purchasers of NeXT hardware and software was the federal government of the United States. Jobs, as erstwhile CEO of that company (and with OS X being the natural successor to the NeXTSTEP OS and the OpenStep environment), will be eager to re-acquire that very profitable market. Inevitably that requires some verification and testing procedures. I don't see any cause for serious alarm. Vigilance, maybe; but not alarm. At the very least, not yet.
I'm pretty sure Jobs will show off some impressive new features and harp on about how this-and-that Changes Everything. Personally, I have two simple hopes for OS X 10.4.
Firstly, I hope they finally manage to put together a decent Finder. Unlike most, I really like the Brushed Metal theme, and as an old-time NeXT user I appreciate pretty much every controversial aspect of the GUI. But I sure wish they would make the Finder multithreaded, so that mount operations and the like weren't so slow and it preserved its responsiveness. Right now, as of 10.3.3, I find the Finder slow, unresponsive, and prone to crashing. We've all experienced it: glacial mounting of SMB shares and FTP directories, crashes when you move around a bunch of previewed graphics files, the Trash getting itself jammed, and so on. Isn't it about time they fixed this pathetic state of affairs?! They should impose a feature freeze until they manage to make the whole Finder experience more bearable. Fix it, then improve it. Don't improve it and neglect to fix it.
Secondly, I hope they recompile everything with IBM's new PowerPC-optimised XL C compiler rather than GCC 3.3. GCC may well be the standard for portability, and it even offers decent performance on x86, but when it comes to PowerPC code it really is inefficient and wasteful. Compiling with XL C can sometimes boost performance by more than 20%. A fifth faster: I'm sure that would make for a lot of very happy users. I'd certainly be one of them.
As for the metadata filesystem: good call. I hope they implement it. It's one of those features that will open up a whole range of hitherto unimagined possibilities. For both consumers and professional users, it could prove to be an excellent upgrade. Forget about iPhoto- and iTunes-style metadata systemwide; think of user-definable, extensible filesystem metadata: philosophically akin to Folder Actions, but at the filesystem level, and systemwide! Geeky? Maybe. Useful? Definitely.
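To make the idea concrete, here's a toy sketch of what "metadata systemwide" could feel like: arbitrary, user-definable key/value tags attached to any file, queryable like a database. Everything here (the class, the tag names, the paths) is made up for illustration; it's a thought experiment, not Apple's design.

```python
from collections import defaultdict

class MetadataStore:
    """Toy in-memory stand-in for a filesystem-level metadata index."""

    def __init__(self):
        self._tags = defaultdict(dict)  # path -> {key: value}

    def tag(self, path, **attrs):
        """Attach (or update) arbitrary key/value metadata on a path."""
        self._tags[path].update(attrs)

    def find(self, **attrs):
        """Return every path whose tags match all given key/value pairs."""
        return sorted(
            path for path, tags in self._tags.items()
            if all(tags.get(k) == v for k, v in attrs.items())
        )

store = MetadataStore()
store.tag("/Users/me/demo.key", kind="presentation", project="wwdc")
store.tag("/Users/me/chart.pdf", kind="chart", project="wwdc")
store.tag("/Users/me/todo.txt", kind="note", project="home")

# Query by metadata instead of by location:
print(store.find(project="wwdc"))
# -> ['/Users/me/chart.pdf', '/Users/me/demo.key']
```

The point is that once the tags live at the filesystem level rather than inside one application's private database, every app (and the user) gets to query them.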
spankalee said:
That the NSA put a backdoor into DES has been a rumor since the '70s. This started because the NBS (now NIST) asked the NSA to review Lucifer, IBM's encryption method that became DES. The NSA came back with two changes. First, they reduced the key size from 128 bits to 64 (effectively 56); second, they changed the table of values that Lucifer used. The first change was probably so that they could mount a brute-force break if they needed to. At the time 128 bits would have been nearly unbreakable, but 64 bits could be broken with a bit of time and money. The second change is what made people think there was a backdoor, because the NSA didn't offer a reason for it. However, in the almost 30 years since DES was released no one has found a backdoor, and many, many cryptanalysts have been looking for one.
Actually, it is likely that the reason has already been found. In the early 1990s, civilian (non-NSA) cryptanalysts developed a "new" and very powerful technique called "differential cryptanalysis", which basically allows you to guess the key of DES-like ciphers by encrypting different (known) cleartexts and observing the ciphertext that comes out. By comparing the different inputs and outputs, a reasonable guess can be made as to the key. But guess what: DES was found to be very resistant to this attack. Why? It was discovered that the exact nature of the S-boxes (the "substitution boxes", or "tables of values" as you refer to them) had been formulated in such a way as to thwart the technique. What does this tell us? Firstly, that the NSA had already developed differential cryptanalysis (or an analogue thereof) in the mid-1970s, nearly two decades ahead of its rediscovery by civilians. Secondly, and this is the clincher, it tells us that the NSA didn't weaken DES: it strengthened it immeasurably. Talk about irony. (Bruce Schneier discusses differential cryptanalysis in his definitive tome Applied Cryptography, Second Edition.)
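For the curious, the core idea is easy to demonstrate. For a given S-box you tabulate how often each input difference (XOR of two inputs) maps to each output difference; a flat table means the cipher resists the attack, while a skewed one gives the attacker a statistical foothold. The 4-bit S-box below is an arbitrary example for illustration (it is NOT one of DES's S-boxes):

```python
# Difference distribution table (DDT) for a toy 4-bit S-box.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def difference_distribution_table(sbox):
    n = len(sbox)
    ddt = [[0] * n for _ in range(n)]
    for x in range(n):
        for dx in range(n):
            # Pair of inputs differing by dx; record their output difference.
            dy = sbox[x] ^ sbox[x ^ dx]
            ddt[dx][dy] += 1
    return ddt

ddt = difference_distribution_table(SBOX)

# Input difference 0 trivially always gives output difference 0:
print(ddt[0][0])  # 16

# The attacker hunts for the most biased nonzero entry:
best = max((ddt[dx][dy], dx, dy)
           for dx in range(1, 16) for dy in range(16))
print(best)
```

The NSA's contribution, in effect, was choosing DES's eight S-boxes so that no entry in their tables was usefully large.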
I'd like to add that breaking the 56-bit encryption of DES by brute force is only just within civilian capabilities even now. It is therefore reasonable to assume that at the time (mid-1970s) the NSA did not possess the computing power to break DES by brute force, or by any other means. Furthermore, "strong" 128-bit encryption such as that offered by AES is totally unbreakable by brute force with a conventional (non-quantum) computer. I remember reading that iterating through each and every key in a 128-bit keyspace, even if each key bit were represented with a single electron, would require more energy than the mass-energy equivalent of the sun. I can't remember where I read it, but it was a sufficiently reputable source for it to remain stuck in my mind.
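The arithmetic is easy to check yourself. The keys-per-second rate below is an assumption for illustration, not a real benchmark; the point is the ratio between 2^56 and 2^128, which no plausible rate can overcome:

```python
# Back-of-the-envelope brute-force timing. RATE is an assumed figure:
# suppose an attacker can test one billion keys per second.
RATE = 10**9  # keys/second (assumption)

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_exhaust(bits, rate=RATE):
    """Worst-case time to try every key of the given size, in years."""
    return 2**bits / rate / SECONDS_PER_YEAR

print(f"56-bit (DES):  {years_to_exhaust(56):.2f} years")
print(f"128-bit (AES): {years_to_exhaust(128):.2e} years")
```

At a billion keys per second, the whole 56-bit keyspace falls in a couple of years, while the 128-bit keyspace takes on the order of 10^22 years, vastly longer than the age of the universe. That is the difference between "expensive" and "impossible".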