Damned if I know, not really. But at least I have a vague idea now, after puzzling through this explanation given by Edward Snowden to ACLU lawyer Ben Wizner. Read the whole thing here.
ES: It’s basically just a new kind of database. Imagine updates are always added to the end of it instead of messing with the old, preexisting entries — just as you could add new links to an old chain to make it longer — and you’re on the right track. Start with that concept, and we’ll fill in the details as we go.
BW: Okay, but why? What is the question for which blockchain is the answer?
ES: In a word: trust. Imagine an old database where any entry can be changed just by typing over it and clicking save. Now imagine that entry holds your bank balance. If somebody can just arbitrarily change your balance to zero, that kind of sucks, right? Unless you’ve got student loans.
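Snowden’s chain-of-links analogy is easy to make concrete. Here is a toy sketch in Python (my own illustration, not how any real blockchain is implemented): each entry stores a hash of the previous one, so quietly typing over an old balance breaks every link after it and is instantly detectable.

```python
import hashlib
import json

def entry_hash(entry):
    """Hash an entry's contents together with its predecessor's hash."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class ToyLedger:
    def __init__(self):
        self.entries = []  # updates are only ever appended, never edited

    def append(self, data):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"data": data, "prev": prev}
        entry["hash"] = entry_hash({"data": data, "prev": prev})
        self.entries.append(entry)

    def verify(self):
        """Recompute every hash; any tampered entry breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev"] != prev or e["hash"] != entry_hash({"data": e["data"], "prev": e["prev"]}):
                return False
            prev = e["hash"]
        return True

ledger = ToyLedger()
ledger.append({"account": "alice", "balance": 100})
ledger.append({"account": "alice", "balance": 250})
print(ledger.verify())                      # True: the chain is intact
ledger.entries[0]["data"]["balance"] = 0    # someone "types over" your balance
print(ledger.verify())                      # False: the tampering is detectable
```

Real blockchains add a great deal on top of this (distribution, consensus, proof-of-work), but the append-only, tamper-evident chain is the core idea Snowden is describing.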
Is this indeed true? The long history of FBI incompetence leads me to suspect it is, but hey, what do I know? Do you know?
So here is what is true.
Apple could have recovered information from the phone had the Apple ID passcode not been changed under orders from the FBI, Apple said. If the phone had been taken to a location where it recognized the Wi-Fi network, such as the San Bernardino shooters’ home, it could easily have been backed up to the cloud. The FBI then lied about whose incompetence led to the mistake.
In other words, while the FBI is demanding massive changes in how Apple protects your privacy, none of those changes would even be necessary if anyone on the government side understood how iCloud works. And these guys want us to believe we can trust them with our data, and indeed, our freedom.
Nation editor and publisher Katrina vanden Heuvel and contributing editor Stephen F. Cohen interview Edward Snowden in Moscow. Read the whole thing here. Snowden is a compelling figure, way above most of his detractors in both intelligence and love of country.
What defines patriotism, for me, is the idea that one rises to act on behalf of one’s country. As I said before, that’s distinct from acting to benefit the government — a distinction that’s increasingly lost today. You’re not patriotic just because you back whoever’s in power today or their policies. You’re patriotic when you work to improve the lives of the people of your country, your community and your family. Sometimes that means making hard choices, choices that go against your personal interest.
People sometimes say I broke an oath of secrecy — one of the early charges leveled against me. But it’s a fundamental misunderstanding, because there is no oath of secrecy for people who work in the intelligence community. You are asked to sign a civil agreement, called a Standard Form 312, which basically says if you disclose classified information, they can sue you; they can do this, that and the other. And you risk going to jail. But you are also asked to take an oath, and that’s the oath of service. The oath of service is not to secrecy, but to the Constitution — to protect it against all enemies, foreign and domestic. That’s the oath that I kept, that James Clapper and former NSA director Keith Alexander did not.
MIT professor Les Perelman wondered how computers would score at scoring student papers:
“As our government agencies and various reform efforts seek to shift high stakes testing away from multiple choice questions, there is growing interest in computer programs that can read and score student essays. But questions persist, given the limitations of the algorithms these programs use.
“So Mr. Perelman has done an experiment. He created something he calls the Basic Automatic BS Essay Language Generator, BABEL for short. During his interview with Carol Off, Perelman fed his machine a topic she suggested, ‘Fair Elections Act.’”
Here is what the BABEL machine provided in response:

“Fun fair for adherents and presumably will never be altruistic in the extent to which we purloin the analysis. Fair is the most fundamental postulate of humankind. Whiner to act in the study of semiotics in addition to the search for reality. Act is intrepidly and clandestinely axiomatic by most of the scenarios. As I have learned in my semiotics class, act is the most fundamental exposition of humanity.”

“Mr. Perelman then submits this essay for grading. The result, a score of 5.4 out of 6, placing this essay in the 90th percentile.”
Tony in Florida writes:
You asked about the status of BitCoin. Here’s what the U.S. Constitution says about coinage:
“The Congress shall have Power to coin Money (and) regulate the Value thereof.”
“No State shall coin Money (or) make any Thing but gold and silver Coin a Tender in Payment of Debts.”
So, where does BitCoin fit in with all this? Can a Federal bank accept BitCoin and pay up value in dollars in payment of debts? Can a State bank accept BitCoin and pay up value in dollars in payment of debts? Then why do some banks do it? Is it for anonymity in drug dealing and “dark site” trading?
One of the many virtues of BitCoin is that its transactions are untraceable (except by the NSA) until tendered for dollars, gold or silver. Trading profits along the way go unreported to, and undiscovered by, the IRS, in America and around the world. This is an irresistible advantage of BitCoin over other forms of coinage. They should pay Jamie Dimon in BitCoin. Then he wouldn’t have to pay any taxes at all.
Does anyone, including the U.S. Supreme Court, actually read and think about the document known as the U.S. Constitution? Or do we just say whatever we wish it said? Consider the proposition that financial influence over our legislators and public officials constitutes protected “free speech,” or that corporations (unmentioned in the Constitution) are “persons” with all the rights of persons, including free speech, but none of the liabilities of actual human persons, such as serving in the military draft, serving time in a federal or state penitentiary for criminal acts, or being executed in Texas.
Well, cheer up, we can always buy our way out of almost any cul-de-sac — if we use BitCoin.
Apropos of nothing but a hot summer day, check out these robots playing soccer. They aren’t there yet but you can see it won’t be long until human teams can no longer compete.
My first-line news site used to be iGoogle because it brought together most of the major sources such as BBC, the Guardian, WaPo, NYT, and so on, on the same page, with weather and whatever other widgets you chose to add. It was a simple and useful interface that I expect was based on Google Reader, which is why they’re both disappearing at the end of October.
Long story short, I’ve switched to NewsBlur. With NewsBlur I can view headlines from any news source that offers an RSS feed, which means nearly everybody; we have one near the top of the page in the right-hand column, “Syndicate this site (RSS/XML)”. Feeds are easy to set up (RSS stands for Rich Site Summary, though it’s often read as Really Simple Syndication), so every major news source has some sort of feed, and many have specialized feeds for particular types of news.
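Under the hood a feed is just an XML document, so the core of an aggregator is only a few lines of code. Here is a minimal sketch using nothing but Python’s standard library; the feed URL in the comment is only an example, and a real aggregator like NewsBlur handles many more feed variants than this.

```python
import urllib.request
import xml.etree.ElementTree as ET

def parse_rss(xml_text):
    """Return (title, link) pairs for each item in an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

def fetch_headlines(feed_url):
    """Download a feed and extract its headlines."""
    with urllib.request.urlopen(feed_url) as resp:
        return parse_rss(resp.read())

# Example usage (any RSS feed URL works the same way):
# for title, link in fetch_headlines("http://feeds.bbci.co.uk/news/rss.xml"):
#     print(title, link)
```

An aggregator simply runs something like this on a schedule for every feed you subscribe to and merges the results into one list of headlines.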
The result of aggregating these various feeds is a presentation something like the following. Click on the image for a full-size view of it.
In addition to aggregating the wheat and not having to sort through the chaff, you can train NewsBlur to bring the types of stories that interest you by giving thumbs up or down to each story based on its title and tags.
This ability is limited by the news source’s tagging; some sources provide multiple tags, some provide few or none. You can also choose phrases in the title you want or don’t want. So it’s not everything one might want in the area of trainability, but those limitations seem to be in the content provided rather than the aggregating software. The stories you’ve flagged as uninteresting are still available, but by default you only get what you want or at least haven’t said you don’t want.
All in all, NewsBlur is a nice tool for seeing lots of headlines quickly. What I’ve found is that I hardly ever visit the actual websites of my major news sources like the Guardian because everything comes in through NewsBlur and the site itself is all duplicates. That’s not true for Talking Points Memo, almost alone among the sources I aggregate.
…Raise your hands. Don’t be ashamed. Now go here. Read the column that then to your wondering eyes should appear. Finished? Okay, now click on “Generate a New Column” at the bottom of the page. Rinse and repeat.
Had enough? Okay, finish up by going here.
Read this whole interview. These searches aren’t limited to journalists, independent or otherwise. The government can and does inspect pretty much anybody’s hard drive and flash card for any reason it wants or for no good reason at all.
Independent journalist Brandon Jourdan recently returned from Haiti after being on assignment documenting the rebuilding of schools in the earthquake-devastated country. However, when he returned to the United States, he was immediately detained by U.S. Immigration and Customs Enforcement agents after deboarding the plane. He was questioned about his travels and had all of his documents, computer, phone and camera flash drives searched and copied. This is the seventh time in five years Jourdan says he has been subjected to lengthy searches, and he has been told by officials that he is “on a list.”
Jourdan joins us in our studio. Catherine Crump, a staff attorney for the American Civil Liberties Union, says that Jourdan is not the only one facing such treatment by the Obama administration. Crump says many journalists and lawyers who often work abroad have also experienced similar interrogations — and the ACLU believes the First and Fourth Amendments must be honored within U.S. airports…
“And ‘These are orders from Washington.’ And they copied my hard drive. They copied my laptop. They copied every single one of my compact flash cards that I use for my camera, which is absurd to me, because I was documenting people building schools and a country devastated by an earthquake…
This is the kind of high art you could produce too, if you had a pretty granddaughter, a four-year-old grandson with floppy bangs and a cool new iMac:
Hey, gang, Team USA is Number 15! Here’s yet another infrastructure outrage for you:
Since 1991, the telecom companies have pocketed an estimated $320 billion — that’s about $3,000 per household.
This is a conservative estimate of the wide-scale plunder that includes monies garnered from hidden rate hikes, depreciation allowances, write-offs and other schemes. Ironically, in 2009, the FCC’s National Broadband plan claimed it will cost about $350 billion to fully upgrade America’s infrastructure.
The principal consequence of the great broadband con is not only that Americans are stuck with an inferior and overpriced communications system, but the nation’s global economic competitiveness has been undermined.
In a June 2010 report, the Organization for Economic Co-operation and Development (OECD) ranked the U.S. 15th in broadband subscribers, with 24.6 percent penetration; the consulting group Strategy Analytics is even more pessimistic, ranking the U.S. 20th, with a “broadband” penetration rate of 67 percent, compared to South Korea (95 percent), the Netherlands (85 percent) and Canada (76 percent). Making matters worse, Strategy Analytics projects the U.S. ranking falling to 23rd by year-end 2010…
I know as much about broadband as I do about the Emperor Hadrian, but I have a mole planted deep within a giant telecom company. She reports as follows:
Well, the news that we are way behind much of the world in connect speeds is right, but I don’t understand many of the other claims. The telcos definitely grab whatever they can get from deals with the PUCs, but from what I can see, that usually does not amount to that much.
What a lot of confusion and inefficiency arises from is that the PUCs will require the telco (in exchange for some rate break or something) to build out their infrastructure such that some number of folks are *able* to order a broadband connection. There is never a requirement to actually *sell* the service.
The telco will then plow in fiber, deploy equipment, etc., to fulfill their obligation to offer service to some god-forsaken county in the middle of New Mexico with 10,000 people in it. Then, 83 of them actually sign up for service. So, assuming that the rate break or other incentive actually did result in more telco revenue, a lot of it has to be spent on the buildout to service those 83 people.
Nobody walks away a winner. The PUC is mad that the hicks are still not online, the telco is shaking their heads saying “I told you nobody would buy it! We’re gonna have to keep that crap running for years!”, the 9,917 folks that still have no broadband still can’t see their YouTube, and everyone is sad that we are another step behind in the race to connect everyone.
So, it really is not some gift to the telcos. Neither is it money well spent in connecting folks to broadband. It is the worst of both worlds — little extra broadband penetration, no telco windfall, and only a bunch of aging equipment deployed with little chance of ever being used. It is really just an inefficient regulatory effort to accomplish something with not enough information or control.
Probably the only way to fully connect the boonies is to re-regulate and force the issue that way. It is just too expensive to do it otherwise.
The French government has now joined the German one in recommending that its citizens find a browser other than Internet Explorer. Microsoft continues to claim that this particular weakness is only in IE6, but no techie types appear to buy that argument, for a variety of reasons. Opera!
Anyone see a mention of this in American media? I haven’t googled, but my news aggregators failed to show any headlines about it with US sources.
The German government has warned web users to find an alternative browser to Internet Explorer to protect security.
The warning from the Federal Office for Information Security comes after Microsoft admitted IE was the weak link in recent attacks on Google's systems.
Microsoft rejected the warning, saying that the risk to users was low and that the browser’s increased security setting would prevent any serious risk.
As you might expect, what Microsoft proposes is to set the security level on the browser to high, effectively blocking out a large proportion of the usable web. As useful as most advice you get from Redmond.
I continue to recommend Opera, but anything other than IE is a move up.
Here’s what was happening while our “best and brightest” were studying economics at Harvard (a notorious gut major when I was teaching there) so they could be yuppie investment bankers and never grow up.
Last month, my news assistant came in with a new Blackberry. Only it wasn’t a Blackberry. It was a cheap Chinese knockoff of a Blackberry. Of course, the Chinese knockoff wasn’t the same as a real Blackberry. It was better.
He’d had a real Blackberry for six months — bought it on a trip to the US for $400, then had to pay another hundred or so in Vietnam to get it unlocked for local mobile service — and it was inconvenient and flukey. He found the new one easier to use. The parts, obviously, were exactly the same; they clearly came from the same factories. But he even found the Chinese software more convenient. Its makers were adding features that hadn’t existed on the “real” Blackberry. The knockoff cost $150.
I thought about this after reading this Derek Thompson Atlantic Business post referencing BusinessWeek’s Michael Mandel’s article arguing that the US may be losing its innovative edge. Mandel points out that the US ran a $30 billion trade surplus in advanced tech in 1998. By 2007 it was a $53 billion deficit. Thompson asks: “Where Mandel’s explanation comes up short is: What are these innovators doing wrong?”
The example of the Chinese knockoff Blackberry suggests that maybe US innovators aren’t doing anything wrong. It’s just that they’re now competing against Chinese innovators, where they weren’t 10 years ago. This may have happened for two reasons. The first is that lack of intellectual property protection, combined with the outsourcing of manufacturing for all those high-tech products to China, gradually destroyed the US’s technological edge.
The second is that in 1998, China didn’t have very many top-flight engineers. But they’ve spent the last 10 years doing nothing but graduate engineers, and now, they do. And that changes everything.
All right, now I feel old: “The Macintosh — the first Apple computer to bear the name — turns 25 on 24 January.” A quarter century since I retired my two Kaypros and moved up to that brand-new Mac with those unbelievably huge floppy disks? You could do a search-and-replace on an entire book manuscript with one of those 3.5-inch beauties.
The most important geek news of the week is the court decision (PDF) in the case of Jacobsen v. Katzer, in which a violation of a non-traditional copyright was held to be just like a violation of a traditional copyright, with the same enforcement mechanisms.
The copyright holder in the case is Robert Jacobsen, the lead developer of the Java Model Railroad Interface, a software package used by model railroad enthusiasts. A firm called Kamind Associates downloaded parts of Jacobsen’s project, stripped out the copyright notice and other identifying information, and began redistributing the modified version without Jacobsen’s approval.
Copylefts, as they’re sometimes called, grant users more rights than traditional publishing or media licenses do. The Creative Commons Attribution license recently added to our sidebar is the “By” license, the loosest level: anyone is free to redistribute, remix, and make commercial use of licensed material, as long as proper attribution is included.
More restrictive options exist as well. It’s possible to prohibit commercial use, or to allow redistribution only if the redistributed work itself carries an equivalent license, for example. If you want to license your website, you can do it in five minutes: first choose the appropriate license at the Creative Commons site, then copy and paste the HTML that’s provided wherever you want it on your web pages.
This is good news for Flickr users and bloggers and other such folks who want to share the products of their imaginations or skills. But it’s particularly great news for the free software community. I’m thinking there were some glum faces in Redmond this week, out of which Bill Gates, as I’ve said before, hauled ass at a propitious moment.
At the personal computer level, free software is today both more powerful and easier to use (and maintain) than corporate software. What keeps the dinosaurs going is control of the hardware environment, and specialized applications. Linux has to work everywhere, with every language and font and screen and central processor and network interface; Windows systems are much more circumscribed, and the Mac is another universe. Macs have cool media-creation and -editing apps, for example; Windows programs in that area are improving, but it’s hard to make a quality product in a Windows environment. I’m not kidding; I’ve built apps on Windows, Mac, Unix, and a couple of other OSs, and Windows is the least reliable. Mac is probably the quirkiest; a fair amount of it is there just to be different. Unix is superficially the most obscure, but in fact the most sleekly and reliably designed of the three (though DEC’s VAX/VMS far surpassed Unix).
In the classic critical-mass fashion, state-of-the-art media manipulation software hasn’t yet migrated to Linux. But for the more quotidian operations such as browsing the web, doing email, cataloging, watching, and listening to media, fiddling photos, and doing MS Office-style stuff, the Linux tools are superior in function and ease of use. Plus, they’re almost universally faster at the same operations.
This kind of quality has not always been there in open-source software, it’s true; but then commercial software is no walk in the park either.
Generally, open source has an outstanding record of providing reliable and useful software. If you spend the effort to build something, package it, and distribute it for free, you must actually have some ego invested in it. If you care about it enough to maintain it over a period of years, coördinate assistants in that process, and accept contributions and consider requests from users, it becomes something like your child.
This kind of approach tends to create communities. When the original impulse is to solve a problem, and the first contributors all face that problem and are coöperating on solutions, what emerges has passed all the tests that its designers thought of, which means at least it solves the original problems. Things that work well tend to get adapted to other situations rapidly; if your product doesn’t evolve, it was probably a pretty simple idea to begin with. If users are soon thinking of uses you never imagined, that’s a sign of success.
The court ruled in Jacobsen v. Katzer that copylefts are enforceable as copyrights, overruling a lower court decision that this was not a copyright violation but a violation of contract. Copyright laws are much stricter, so this and some prior, more limited, rulings are clear encouragement to the free software community. Work can be done in a non-capitalist fashion, and distributed, used, and relied upon world-wide, without the capitalists either stealing it or shutting it down.
As ours becomes better than theirs, they’ll go under.
In a previous post I talked about the specter of open-source software in Microsoft’s rear-view, nearly side-view, mirror. Later in this post I’ll discuss my experiences over the past year running the Linux operating system.
But first let me make my case to those who plan to exit before the geeky stuff: If you’ve ever been pissed off at your computer and thought of switching to Linux, now is a great time to spend an hour or so taking it out for a spin.
The first section tells which types of computer users can benefit from switching to Linux. Next are discussions of why Linux is more fun to use, followed by brief instructions on how to download the file, burn it to a CD, and boot with it for a Linux test drive.
For a long time the knock on Linux was that not enough applications ran on it; later it was that the applications weren’t of sufficient quality and versatility to replace existing systems. This started as a fair, and later became a partially fair, characterization. At the margins of usage, it’s still true that there are fewer Linux applications. But the margins have moved; these days most people depend on their computers to browse the web, do email, play and organize music and video, fiddle with photos, and work with office-style documents and spreadsheets.
Linux now has excellent replacements for all these applications. Occasionally there’s a little more setup, because Linux handles far more hardware combinations than Windows or Mac software. But in my experience the quality of Linux applications normally exceeds that of the programs I used on Windows, often by quite a bit. Linux software follows the Mac philosophy of presenting only those menu choices that make sense in the current context; but Linux presents a lot more choices, and has a whole lot more configurability.
If this list covers everything you do with your computer, then you’re right down the middle of the audience for a Linux distribution like Ubuntu.
The name of the distribution comes from the African concept of ubuntu which may be rendered roughly as “humanity toward others”, “we are people because of other people”, or “I am who I am because of who we all are”, though other meanings have been suggested.
Ubuntu had a brush with fame when it was written up in non-tech columns several months back after the Dell website revealed that Michael Dell ran Ubuntu on one of his many home computers. Ubuntu uses the Gnome desktop manager, which looks a good deal like Windows on the surface, but is smoother and more coherent like a Mac in design and function. Users who want greater breadth of control might prefer my favorite distribution, Kubuntu, which replaces Gnome with the more configurable KDE desktop manager.
In any case it’s not a definitive choice. Ubuntu and Kubuntu are both official branches of the same project. Pretty much any program that runs on one will run on the other; you can even run the Gnome desktop manager on Kubuntu or the KDE desktop manager on Ubuntu if you choose. Hey, it’s your system.
Another recommendation for Linux is that installation packages have improved dramatically in recent years.
The kicker is, you can make a bootable CD, which will run Ubuntu or Kubuntu without touching your disk or affecting your system in any way. When you’ve downloaded the file and burned it to CD, you can boot with the CD, check out the new OS, then reboot without the CD and be back to your existing system. (If you’re running Windows now, you won’t be able to read or write your Windows files until you install the OS plus a package that reads and writes Windows files. Which, natch, is free.)
It’s difficult to list the reasons that Linux is better, because (1) there are so many of them, and (2) what’s most important differs from user to user. But here’s what strikes me most, in no particular order.
This is far from a comprehensive list, but it provides a flavor. Now to the instructions.
Here’s all you have to do to make a CD that will (probably) run Linux on your computer.
I say “probably” because there have been reports with each new version that some people cannot boot with the Live CD. When I first installed Ubuntu over a year ago I was one of them, though the Live CD for Kubuntu worked fine. If you can’t boot with the Live CD, but want to try installing anyway, use the Alternate CD, which will install even on systems that won’t boot with the Live CD. (Alternate CD data is available from the same websites as the Live CD data.)
But everyone I know who’s tried Kubuntu 8.04, known as Hardy Heron, has been on the net as soon as they booted with the CD. Assuming, of course, they were on the net in their standard configuration.
It’s quite simple to make a bootable CD with Ubuntu or Kubuntu, which will run without affecting your existing system in any way. At the overview level, there are four steps: download the ISO file, get a program that can burn an ISO image, burn the image to a CD, and boot from that CD.
Naturally parts of Linux will run slower when it’s loaded from a CD than if it were installed, and you won’t be able to save configuration changes — you haven’t given it any disk space where data can be saved. Otherwise, though, it will work just like a newly installed system.
If you’re a little nervous about the whole prospect, you might start with a fine guided tour of Ubuntu, with lots of screenshots and good information. Ubuntu aims at friendliness, and is in a sense Linux’s answer to the Mac.
If you’re confident and curious to see how sophisticated Linux has become, I recommend Kubuntu. It’s very powerful, yet very coherent. It’s not really the Windows to Ubuntu’s Mac, because Kubuntu actually works, and it doesn’t try to fool you into thinking it can do something it can’t. Either the function is there and easy to use, or we haven’t figured that part out yet, no bones about it.
There are many others, such as Red Hat and openSUSE; but these are the two I’m familiar with. Once you’ve decided on a distro, as the Linux cognoscenti say, you’re ready to begin.
ISO files hold a complete disc image in a format a computer can boot from; this is what they call the “Live CD” data. A non-Linux system sees it as just one big opaque file, so you can’t simply copy it to a CD. Thus you need to proceed to the second step.
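One precaution worth taking before burning: distributions publish checksum files alongside the ISO, and a corrupted download makes a coaster. Here’s a sketch of the verification, using only Python’s standard library; the filename is just an example, and you’d compare the output against the matching line in the published checksum file (MD5SUMS in those days, SHA256SUMS now).

```python
import hashlib

def file_digest(path, algorithm="sha256", chunk_size=1 << 20):
    """Hash a (possibly huge) file in chunks so it never loads fully into RAM."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Example (hypothetical filename):
# file_digest("kubuntu-8.04-desktop-i386.iso", "md5") == digest_from_checksum_file
```

If the digests match, the download is intact and you can burn with confidence.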
Because the bootable CD is not a standard file from the Windows or Mac points of view, you can’t just drag-and-drop the file onto a CD. On Windows you need a tiny special program to burn it. Macs apparently have a similar program built in; follow the link for instructions on finding it.
On Windows, right-click the file and find the program you downloaded in Step 1. On Mac, find the appropriate binary-file writer and write the ISO file to CD.
This should bring up whichever OS you downloaded. Now you can browse the web, try out office products, and so on in exactly the environment you’d see after a fresh install.
Even those with absolutely no interest in computers might have noticed the recent departure, complete with tears, of Bill Gates from his baby, or his monster depending on your viewpoint. He’s leaving Microsoft to concentrate his energies on giving away money.
Or at least that’s the official story. My counterclaim? Well, I never took him for the genius he’s widely reputed to be, but I think he’s smart enough to get out while the gettin’s good. Following years of fortune, the William F. Buckley method of software manufacture faces a future of reduced impact. You’ll recall that on starting the National Review, Buckley said of his magazine, “It stands athwart history, yelling Stop, at a time when no one is inclined to do so, or to have much patience with those who so urge it.”
Indeed. Is this a dark view of human nature, that we can’t help but bring ruin on ourselves? Or is it a contemporary version of the old uneasy-lies-the-head?
Perhaps the only real innovation Gates brought us was his contention that he owned the spirit that lived on the plastic disc he sold. Basically he realized that if he stood athwart the evolution of software, he could charge for each stage of the journey instead of selling the whole thing at once, thus improving profit margins. It was only later that he realized how much money could be made through intentionally poor design and development techniques.
To me this appears to be a Big Con, but he made it work, portioning out to an adoring audience — or at least a continuously spending one — features that were old hat on real computers, followed by fixes for the bugs the features created, all in exchange for a continued income. And what an income!
Gates, in other words, was a pioneer of so-called intellectual property, a concept to which I have too many objections to list at the moment. Protect and encourage innovation with the patent system, to be sure. But as Windows users came to realize, Gates only invented ways to game the system, and there’s plenty of prior art in that area.
Then there’s the poor design and low quality of the Microsoft product fleet. Not to mention the shallow documentation. Or the high prices. Or the ridiculous strategy of security through obscurity. Or the Microsoft attitude that their license is more important than anything related to you. And did I mention how sucky the products are? You really notice it when you start using, for example, Amarok, which destroys any Windows-based media player I ever saw.
In fact there are way more than a tech blog’s “Five Reasons The Intel-Microsoft Duopoly Is Dead”, some of which are offered in the comments to that post. One of the demons (perhaps in this context I should say daemons) in Microsoft’s rear-view mirror has been closing so fast it may already be about to pass:
The emergence of free software could be hurting Microsoft’s bottom line. The company said that sales of its Office products among consumers dropped 39% in the most recent quarter. The company blamed most of the decline on the fact that the previous year’s third-quarter results were significantly boosted by revenue that had been deferred under an Office 2007 upgrade program.
Still, consumer sales of Office have shown no growth over the past three quarters, Microsoft said. The problem: Microsoft’s Office revenue typically jumps when a new version is introduced, then quickly tapers off.
With Equipt, Microsoft is hoping to extend the consistent revenue stream provided by commercial Office licensees to the consumer market, and it’s hoping that everyday computer users will see enough value in the offering to forgo free software.
Open-source software is clogging up the works for the folks Bill leaves behind. It’s messing with the business plans of corporations whose income depends on the proprietary nature of their software products. It may even begin to change society. If that sounds silly, remember how it sounded some years back when the geeks were telling us the internet would change everything.
The two most famous pieces of open-source software these days are probably Firefox and Linux, but there are lots of others. OpenOffice.org is a free open-source replacement for Microsoft Office that will be familiar to Office users in look and feel; it has everything most people expect from Office, including the ability to read and write MS file formats as well as many others. I don’t use word processors and spreadsheets and the like very often, but I’ve relied exclusively on OpenOffice.org products for seven or eight years, and have been very happy with them.
Feeling the heat, Microsoft has come up with an offer they think you can’t refuse: only $70 a year for the award-winning Office suite. Or you can use OpenOffice.org for free, and if you feel the need to fork over $70 I’ll send you my PayPal info.
In the big picture, the day of dominance for corporate-behemoth software is passing.
Early on, companies like IBM and DEC made big bucks “pumping iron”. Manufacturing useful and reliable computing hardware has always been a non-trivial job. In those days the few companies around the globe that could muster the necessary technique had a ready market for their hardware.
Given their unique knowledge of how the hardware worked, the manufacturers had a monopoly on the software that ran on that hardware, something like Apple’s current setup but much more restrictive. Every hardware manufacturer followed this strategy, so consumers found their choices limited. In exchange, they often got very high quality products and services, from IBM and DEC for example. By high quality I mean things worked and kept working; oil-service companies bought DEC machines because they’d run for ten years continuously, outside of monthly preventive maintenance stops. Competition was fierce, but rapid growth made room for several competitors to expand concurrently.
Today, scarcity is enforced by the enormous cost of building the manufacturing facilities. When I worked in a job related to the fabrication of semiconductors some years ago, the cost of a new fab was one to two billion. (And that was back when the US dollar kicked global financial butt.) Operating costs are such that you normally have to sell enormous quantities of chips, but our ravenous appetite for intelligence in the objects around us creates the possibility of huge profits. Which is why they’re so cheap we put them in watches and phones, cars and washing machines, pets and pajamas.
Cheap chips give all kinds of people all over the world enough computing power to prompt widespread daydreams of building very large systems entirely without corporate influence. Current versions of Linux prove that such dreams can be realized. Software that’s more reliable, less bug-prone, with more features and easier to use, for free, including continuing updates as they become available: corporations can’t beat that, and even Microsoft can’t buy an organization that doesn’t exist.
Robert Anton Wilson fans will recognize the essence of the Discordian spirit in the open source movement; political types might catch a whiff of anarchism. But this one is diffuse enough, hard enough to pin down, that it might hope to survive a little longer than similar movements in the past.
The open-source approach is unlikely to take over the entire field; specialization is nearly always the path to the highest profit margins. But in infrastructure areas such as operating systems, compilers, and networks, open source is already the way to go. And today’s add-ons are part of tomorrow’s infrastructure.
To a certain extent the old model depends on people being paid to work long hours fulfilling the dreams of corporate marketing departments for the benefit of executives. Whether that’s exploitation or just the way twenty-first century state capitalism works, it continues the American tradition of great ideals not lived up to. A Louis Kelso-style system would distribute the gains of capitalism — which Greider calls the greatest engine of wealth creation humanity has yet devised — more equitably.
Approaching either of the economic opposites, complete equality of wealth or complete concentration of it, brings conflict; too close an approach can bring revolution. America’s financial system has become skillful at riding that line. How many corporations in recent years have been caught profiting from third-world sweatshops? Not to mention how many are paying no taxes while accepting all the services the community supplies; worse yet, polluting their surroundings and leaving us to suffer from it and pay for its cleanup.
The open source movement is not going to fix the problem of world hunger, at least not directly. But it has finally reached the point where it can encourage people to wean themselves off the corporate teat by offering a tastier product, and that is a classic example of what Chomsky’s Establishment considers the threat of a good example.
An upcoming post will detail my experiences with Linux over the last year. But to emphasize the meaning of events rather than their technical details: I claim that, though it’s still in an early stage, this new model threatens to change social norms as well as corporate boardrooms. In another post I might explore the possible social repercussions of open source, but this one’s already too long.
All in all, I wouldn’t short Microsoft stock yet, but for Gates it’s a good moment to decide to spend more time with the family.
[H/t to Hugh Macleod for the proposed Microsoft business card. His website has a bunch of good stuff.
Also, three extra points to anyone who gets the tilde joke.]
From McClatchy Newspapers:
WASHINGTON — U.S. border agents are copying and seizing the contents of laptops, cell phones and digital cameras from U.S. and foreign travelers entering the United States, witnesses told a Senate subcommittee Wednesday.
The extent of this practice is unknown despite requests to the Department of Homeland Security from the Senate Subcommittee on the Constitution and several nonprofit agencies.
The department also declined to send a representative to the hearing. Subcommittee Chairman Russ Feingold, D-Wis., said Homeland Security had told him that its “preferred” witness was unavailable Wednesday…
(Ed. note: Wednesday was his day in the reading room.)
…to figure this Bush program out. So the Headless Nail has done it for you:
The Bush administration is quietly but firmly trying to set in place the capability to monitor, intercept, and analyze all visits to federal government web sites. It's called the Einstein program, which no doubt has the old civil libertarian and FBI target spinning in his grave.
Once such a system is pounded into place, it becomes, like me, a headless nail in the bureaucratic machinery. Both of us are almost impossible to pull out. So here's what you have to look forward to:
If you visit any government web site, the government could monitor your visit, know all of the pages you have seen, and capture and analyze any information you send or receive — all in real time. It would be like having your very own Big Brother, looking over your shoulder at your very own screen.
And taking notes as you surf.
This program, known as “Trusted Internet Connection” would require that all federal agencies access the web through portals approved and controlled by the Department of Homeland Security.
At each portal, DHS would install an “intrusion detection system” — Einstein. Details about Einstein are sketchy, but it will capture at least all traffic flow, source and destination IP information, and data sent or received.
In all probability this electronic gatekeeper would allow Homeland Security to spy on government employees too, which will be handy for tracking down whistleblowers.
The ostensible reason for the program is, of course, protecting us against terrorist hackers. DHS officials won’t say much about how they will use this capability, so you’ll just have to trust them when they say that the “program is not intended to collect information that will be retrieved by name.” [italics added]
But then neither did the DHS intend to force airline passengers to remove nipple rings with pliers. Nevertheless that is exactly what its agents did to a woman in Lubbock last month. By the time even the best of intentions reaches the bottom rungs of a huge bureaucracy, the result can defy logic and common sense. To say nothing of common decency.
Although the Administration wants this program in place by June (unlikely for technical reasons), DHS has not provided the legally-required Privacy Impact Assessment for the project. So we don’t know what personal information will be collected, how it will be used, or what (if any) safeguards against spying on citizens will be required.
All government web sites are required to post privacy policies, and in my experience government webmasters take this responsibility seriously. Under the Bush plan, these protections would become meaningless, as DHS would position Einstein between the citizen and the government site.
Note that the Einstein program does not require the cooperation of any private partners (such as phone companies or ISPs) and is not subject to any routine judicial supervision — helpful if you want to avoid any embarrassing leaks or disclosures about how it is actually being used.
In summary, the Bush Administration proposes to acquire a powerful new domestic electronic spy network, and citizens are supposed to trust the good intentions of Bush's DHS and Justice officials that these powers will not be misused. Domestic political opponents, whistleblowers, and ordinary citizens who don’t want the government spying on their web visits will be forgiven for their skepticism.
Back in the late seventies, when opinions about the possibilities of artificial intelligence were harder to find but no less broadly speculative, I encountered a wonderful book, both erudite and charming, called Computer Power and Human Reason. I was fascinated by the ideas in it and those it led me to. I was equally impressed by the simple fact of someone thinking at that level. Its author, Joseph Weizenbaum, died a week ago in Gröben, Germany.
What is it, after all, that makes people different from machines? Sure, we die, and they turn off. But do we think differently? Is one of our approaches preferable? This is the kind of stuff Weizenbaum enjoyed considering.
In 1966, he published a comparatively simple program called ELIZA, which demonstrated natural language processing by engaging humans in a conversation resembling one with an empathetic psychotherapist. The program applied pattern-matching rules to the human’s statements to figure out its replies. (Programs like this are now called chatterbots.) Weizenbaum was shocked that his program was taken seriously by many users, who would open their hearts to it. He started to think philosophically about the implications of Artificial Intelligence and later became one of its leading critics. His influential 1976 book Computer Power and Human Reason displays his ambivalence towards computer technology and lays out his case: while Artificial Intelligence may be possible, we should never allow computers to make important decisions, because computers will always lack human qualities such as compassion and wisdom. This he saw as a consequence of their not having been raised in the emotional environment of a human family.
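The flavor of ELIZA’s rule-based rewriting can be sketched in a few lines of Python. These patterns are hypothetical stand-ins of my own, not Weizenbaum’s originals (his DOCTOR script had many more rules, plus keyword ranking and a memory):

```python
import re

# Each rule pairs a pattern with a reply template; the fragment the
# pattern captures gets reflected back at the user, therapist-style.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please go on."

def reply(statement: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return DEFAULT

print(reply("I am feeling lonely"))  # -> How long have you been feeling lonely?
print(reply("The weather is nice"))  # -> Please go on.
```

The unsettling part, as Weizenbaum discovered, is how little machinery it takes before people start confiding in the thing.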
…saw that computers adjusted human intelligence ‘from judgment to calculation,’ that they privileged mathematical models and instrumental reason as the basis for action, and that they created a paradoxical situation in which computers initially empower humans but will eventually render them powerless. He urged his colleagues not to put themselves in the service of the military and other death industries, and specifically called for computer scientists to refuse to conduct research on projects that would couple organic and mechanical systems, and to avoid speech recognition research because it would profoundly alter the way people understand one another.

Assuming that computer capabilities continue to increase at something like a Moore’s Law rate, they’ll very soon overtake us in the areas of gathering and processing data. They can already land an airplane more precisely and reliably than a pilot and beat the world champion at chess. Will they outperform us at sex and cooking next? Music? The novel? Or will we, the creators, choose to draw some lines and create some definitions? Are some things uniquely human, or is everything an artifact of electricity and chemicals?
These questions are too big for even a particularly gifted thinker to answer alone. But Joseph Weizenbaum helped me realize that questions like these could, and in fact demanded to, be asked.
Joseph, good work, bro. Keep it comin’.
Who would you expect but Microslop?
The Times has seen a patent application filed by the company for a computer system that links workers to their computers via wireless sensors that measure their metabolism. The system would allow managers to monitor employees’ performance by measuring their heart rate, body temperature, movement, facial expression and blood pressure. Unions said they fear that employees could be dismissed on the basis of a computer’s assessment of their physiological state.
Technology allowing constant monitoring of workers was previously limited to pilots, firefighters and Nasa astronauts. This is believed to be the first time a company has proposed developing such software for mainstream workplaces.
Data, say those who process it for a living, expands to fill the space available.
In the same fashion, innovation proceeds at the pace allowed by the infrastructure. Which means the US is screwed.
In Japan, reports the Washington Post, broadband internet access speeds are eight to thirty times faster, not to mention considerably cheaper, and a similar statement applies to South Korea and much of Europe. I recently read about a minor scandal in the UK involving inaccurate claims of access speeds:
…Which? claims that while many [UK] companies advertise speeds of up to 8Mbps (megabits per second) or faster, consumers are achieving an average speed of just 2.7Mbps, while some have experienced speeds as low as 0.09Mbps.
If you’ve ever seen speeds greater than about 2 Mbps (2,000 Kbps), please post your ISP’s contact info. In my house we have Comcast cable, and although it’s way faster than our DSL was, the highest speed I’ve ever seen is about 1,800 Kbps (1.8 Mbps), two-thirds of what the Brits are averaging.
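A side note on units, since they trip everyone up: ISPs advertise in megabits per second, while download dialogs usually report kilobytes per second — a factor of eight apart. The arithmetic in Python:

```python
def mbps_to_kbps(mbps: float) -> float:
    """Megabits per second to kilobits per second."""
    return mbps * 1000.0

def mbps_to_kB_per_sec(mbps: float) -> float:
    """Megabits per second to kilobytes per second (8 bits per byte)."""
    return mbps * 1000.0 / 8.0

print(mbps_to_kbps(1.8))        # -> 1800.0
print(mbps_to_kB_per_sec(1.8))  # -> 225.0, roughly what a download dialog shows
```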
Then there’s Japan, where DSL is five to ten times as fast as cable in the US. Apparently we had a hand in that:
The copper wire used to hook up Japanese homes is newer and runs in shorter loops to telephone exchanges than in the United States. This is partly a matter of geography and demographics: Japan is relatively small, highly urbanized and densely populated. But better wire is also a legacy of American bombs, which razed much of urban Japan during World War II and led to a wholesale rewiring of the country.
Another application for the B-2; what foresight we showed in building it!
Problem is, our wonderful bombers are not as powerful as our stupid corporations and their “We’ll do it if you don’t regulate us” shtick.
In 2000, the Japanese government seized its advantage in wire. In sharp contrast to the Bush administration over the same time period, regulators here compelled big phone companies to open up wires to upstart Internet providers.
In short order, broadband exploded. At first, it used the same DSL technology that exists in the United States. But because of the better, shorter wire in Japan, DSL service here is much faster. Ten to 20 times as fast, according to Pepper, one of the world’s leading experts on broadband infrastructure.
Indeed, DSL in Japan is often five to 10 times as fast as what is widely offered by U.S. cable providers, generally viewed as the fastest American carriers. (Cable has not been much of a player in Japan.)
So how did Nippon Telegraph and Telephone Corp. respond?
With the help of government subsidies and tax breaks, NTT launched a nationwide build-out of fiber-optic lines to homes, making the lower-capacity copper wires obsolete.
“Obviously, without the competition, we would not have done all this at this pace,” said Hideki Ohmichi, NTT’s senior manager for public relations.
His company now offers speeds on fiber of up to 100 megabits per second — 17 times as fast as the top speed generally available from U.S. cable. About 8.8 million Japanese homes have fiber lines — roughly nine times the number in the United States.
The burgeoning optical fiber system is hurtling Japan into an Internet future that experts say Americans are unlikely to experience for at least several years.
But we have the best health care system in the world, right?
If you love ads on your web pages, and your computer browses your favorite sites at blazing speed, you probably won’t care about this post.
As I’ve been configuring my new Linuxes, I’ve come across two tricks that have improved my browsing experience quite a lot.
First, there’s Privoxy, a free privacy-oriented proxy server. It examines the requests your browser sends out, and discards those headed for known ad servers. As a result, most of the ads on the front page of the New York Times, for example, are replaced by gray-and-white checkerboard patterns. You can probably configure the replacement image, but I haven’t looked for that yet.
Privoxy works on my Ubuntu and Kubuntu desktop, my XP laptop, and my mother’s XP desktop (she installed it in five minutes on the phone with me). I believe it also works on OS X but I don’t know that for sure.
Installation was trivial: download, unpack/install, set it to run at startup, and start it. Then tell the browser there’s a new proxy, and refresh the screen a couple of times. Presto! Ads gone. Didn’t have to reboot, even on XP!
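The same trick works outside a browser: any HTTP client can be pointed at the proxy. A minimal Python sketch, assuming Privoxy is running on its default listen address of 127.0.0.1:8118 (an assumption on my part — check your config.txt if you’ve changed it):

```python
import urllib.request

# Privoxy's default listen address (assumed here; see config.txt).
PRIVOXY = "http://127.0.0.1:8118"

# Build an opener whose HTTP and HTTPS traffic goes through Privoxy,
# so script-driven requests get the same ad-stripping the browser sees.
proxy = urllib.request.ProxyHandler({"http": PRIVOXY, "https": PRIVOXY})
opener = urllib.request.build_opener(proxy)

# With Privoxy running, this fetch would come back with ads filtered:
# page = opener.open("http://example.com/").read()
```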
Second, there’s OpenDNS. This is a free service that acts as a Domain Name Server of first resort. A DNS server takes a domain name, more or less readable to humans, and converts it to a numeric IP address; your computer then sends its request to that address.
OpenDNS takes what you type into your address bar and interprets it with intelligence. It can correct spelling errors (craigslist.og when you mean .org), understand nicknames for sites and for actions, filter phishing sites, provide adult-content controls, and so on. But the most useful thing to me is that it speeds up browsing, because their servers keep enormous caches of recent lookups. If you request a domain whose cached record hasn’t expired, they answer immediately, without having to query the site’s own name servers. This doesn’t matter when your own computer has already cached the answer, but in many cases you’ll notice a significant improvement.
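The lookup step itself is easy to watch from Python. Whichever resolver your system is configured with — OpenDNS or your ISP’s — answers the query; localhost happens to be resolved locally, but the call is identical for any domain:

```python
import socket

# Ask the configured resolver for the IPv4 address behind a name.
# For "localhost" the answer comes from the local hosts file rather
# than the network, which makes it a safe, predictable example.
ip = socket.gethostbyname("localhost")
print(ip)  # -> 127.0.0.1 (the loopback address)
```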
OpenDNS is even easier to use: there’s no software to download or install. All you have to do is tell your system where the OpenDNS servers are, and you’re up and running. To use some of the features, you have to register, but it’s free and all you need is a working email address.
Obviously I have no monetary interest in these products, since they’re free. I don’t know anyone working for the companies, as far as I can tell. But I’ve had good experiences with them so far.
In the battle against the Great Satan, another blow has been struck.
Yes, Bill Gates has lost another sucker. My desktop is now pure Ubuntu Linux. Not completely on purpose, mind you, but nonetheless happily.
About a year ago, I tried to make my PC a dual-boot system by installing Debian Linux (Ubuntu is based on Debian). I encountered a problem trying to partition my disk, and didn’t find a way around it.
This time, headed for the same goal, I met a somewhat similar problem. Last time the partition manager wouldn’t partition; this time there were two extra partitions that I didn’t recognize. One was small, about 30 megs, at the beginning of the disk; the other was about five or six gigs, which appeared in the defragmenter to be an unmovable file. This sounds suspicious to me: how did I get a partition in the middle of the disk, and what the hell’s on it? I couldn’t see it with Windows Explorer or in Command Prompt.
Well, while pondering this question, I decided to put in my new Linux partition, which worked fine but left two portions of the disk marked Unusable from the partition manager’s viewpoint. In trying to fix that, I did the right thing to the wrong partition, and destroyed my XP installation.
Of course I’d backed up all my data (except, as it turns out, my .emacs file and my Opera bookmarks, reasonable copies of which were on my laptop). I didn’t lose anything but a few minor changes I made in the day between backing up and screwing up, and I remember them. But, given the way Microsoft has chosen to distribute their massively expensive (they make 80% profit margins on Windows and Office) and massively flakey software, Dell doesn’t ship a Windows CD with its new machines. I’ve bought three computers in the last two years, and none have come with CDs to re-install Windows. I went to the Dell website to see if there was any information about reinstallation that might help; but to get any serious answers, I’d have to pay them.
The funny thing is, once I wiped out XP, everything was a snap. From the time I realized I’d dumped XP, through the install of Ubuntu, to the moment I was on the net running Firefox (the default browser on Ubuntu) and downloading Opera was about an hour. I took all the defaults; they all worked; the system started up immediately, and read my NTFS files on the external disk. The Ubuntu distribution comes with media tools, office programs (OpenOffice, which reads and writes MS-compatible files as well as many other formats), The GIMP (a Photoshop-like program), and lots of other stuff.
They fit this whole thing on a single CD, usable on most Intel 386-style computers (and the same applies for other architectures). Many computers will run Ubuntu from the live CD without installing the OS. Mine wouldn’t, so I had to download the alternate CD, which is just like the live CD except it doesn’t boot into a live desktop session. The system will supposedly fit into an area as small as three gigs, but ten gigs is recommended for actual use. How much space does Windows take up on your computer (if you’re using it)? That’s why it’s so slow.
Caveats: in general, fancy graphical stuff looks like it’ll require some configuration. Flash doesn’t yet work on Linux Opera, though they say it’s fine in Firefox; I haven’t checked. Some people would consider that a limitation, others a blessing. I had to download MP3 software separately for copyright reasons, but that was trivial, and everything worked well. I especially like my new media player, called Amarok. Its music functionality is about equivalent to WinAmp, except for the visualizations.
Speaking of which, Ubuntu is a configurator’s dream. You can configure damn near everything, and the options, unlike many MS tricks, both work and make sense.
Of course no package on a single CD can include everything you might want. But with Ubuntu you get software to find and obtain packages of useful stuff. Apparently there are fancy package managers out there, but at this point I’m still using Synaptic, which comes with Ubuntu. This program searches Ubuntu repositories for whatever you ask, then downloads and installs it. The whole system is amazingly simple to use compared to the installation packages on Windows. Just to have a single place to go when you’re looking for something to begin with is a big change.
Then there’s the fact that the stuff you’re downloading is free.
There’s lots more to say about Ubuntu, and I’ll probably be saying it in the next few days. But mainly I want to say that, once I screwed up and had to start from scratch, it was only about two hours before I was thinking, Damn, that might have been a good move on my part. I spent two days learning to make a dual-boot system, and half a day actually making a single-boot. And compared to XP, Ubuntu is blazingly fast. Everything hops, and I’m running many of the same programs (Opera, Emacs, Scid).
Oh yeah, one more thing. If you’re thinking of converting, spend an hour or two looking around the web for good places to get help beforehand; you’re likely to need it at some point, and it’ll save you time. My experience has been great with getting help. This morning I was trying to figure out the right way to hot-swap my external drive between my XP laptop and my Ubuntu desktop. I asked the original question last night; this morning there was an answer. I had two follow-ons, both of which were answered in less than 30 minutes, and within a little more than an hour we’d solved the problem. Again, a community effort, not a means of enriching a small number of persons of questionable moral character.
In between the most enjoyable visits to the dentist and the eye doctor, I’m trying to turn my desktop computer into a dual-boot system. It currently has Windows XP, which I consider the second best — or rather least worst — OS from Redmond to beset the world.
After years of abuse, Micros~1 has lost me to a system whose design is clear, if quirky; what a colleague called “hacker friendly”. As opposed, say, to the two main ones on offer in today’s market: one business model based on incompetence that engenders cheating, the other on slipping you a drug you can’t kick even though it hurts you.
But what really has me excited is the idea of a computer that doesn’t crash, on which you install new versions — not just new applications, but new versions of the OS — without rebooting.
You might have read that Dell will soon be shipping computers with probably the most popular Linux distribution right now, Ubuntu. It seems to be a pretty conscious endorsement: Michael Dell actually runs Ubuntu 7.04, which the soap operas call Feisty Fawn, on his home laptop.
So far I’ve downloaded the installation package (.iso file), burned it to a CD, and tried to boot with it. That failed; it consistently hung at 91% of the way through loading. An hour or so with the Ubuntu docs convinced me to try the Alternate CD, which can install everything but doesn’t have the live-boot capability. That boots correctly, and may install. We’ll see.
So where are you on the religious issue of operating systems?