campus in 3D

I hadn’t taken a look at Google Earth for a while, but as I was adding a map to our department website, I noticed the “earth” button on maps.google.com – I clicked it, and got [a nice 3D view of campus](http://maps.google.ca/maps?f=q&source=s_q&hl=en&geocode=&q=University+of+Calgary,+Calgary,+Alberta&sll=49.891235,-97.15369&sspn=45.536678,109.775391&ie=UTF8&hq=University+Of+Calgary&hnear=University+Of+Calgary,+Calgary,+Alberta+T2N+1N4&ll=51.080153,-114.132461&spn=0.001491,0.004372&t=f&z=19&ecpose=51.08140626,-114.13433575,1146.33,136.785,80.987,0). And had a pleasant surprise – someone had started creating 3D models of buildings on campus. Only a handful are there now, but I’m sure the whole campus will eventually be 3D-ified. Someone’s been busy building models for downtown, as well (including cranes on the top of The Bow).

[Screenshot: 3D buildings on campus in Google Maps’ Earth view]

puppetmaster

This was sent to me by a few people who know my fondness for crafting tin foil hats to protect myself from the all-seeing eye of Google.

> ![One Password to Rule Them All](http://www.darcynorman.net/wp-content/uploads/2010/09/one-password-to-rule-them-all.png)

The [full comic is worth a read as well](http://xkcd.com/792/). It’s possibly true that the puppetmaster has no evil intentions, but that doesn’t mean that we should continue to give anyone power over all of our online presences.

On a related note, an old project on campus was recently resurrected. We took a look at the code and data, and discovered that whoever built it had designed it to store passwords in the database as unencrypted plaintext. On a lark, I tried some of the passwords against the corresponding email accounts. About a third of them worked there, too… (The app is being rebuilt by a third-party consultant, after we nuke the unencrypted data so it’s safe to send to the new programmers.)
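For contrast, here’s a rough sketch of what sane password storage looks like. This is a hypothetical example in Python, using only the standard library – it’s not the old app’s code or the consultant’s rebuild, and in practice a dedicated library like bcrypt would be a better choice – but the idea is the same: the database only ever sees salted hashes, never the passwords themselves.

```python
# Hypothetical sketch: store salted password hashes, never plaintext.
# Uses only the Python standard library (PBKDF2-HMAC-SHA256).
import hashlib
import hmac
import os

ITERATIONS = 200_000  # assumption: a reasonably slow work factor

def hash_password(password: str) -> str:
    """Return a 'salt$hash' string suitable for storing in the users table."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt.hex() + "$" + digest.hex()

def verify_password(password: str, stored: str) -> bool:
    """Re-derive the hash with the stored salt and compare in constant time."""
    salt_hex, digest_hex = stored.split("$", 1)
    salt = bytes.fromhex(salt_hex)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(digest.hex(), digest_hex)

# Usage: save hash_password(pw) when an account is created; on login,
# fetch the stored string and call verify_password(pw, stored).
```

With something like that in place, a leaked database doesn’t hand out passwords that can simply be tried against people’s email accounts.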

William Gibson – Google’s Earth

From [William Gibson’s Op-Ed Contributor article – Google’s Earth – NYTimes.com](http://www.nytimes.com/2010/09/01/opinion/01gibson.html?_r=2):

> Google is not ours. Which feels confusing, because we are its unpaid content-providers, in one way or another. We generate product for Google, our every search a minuscule contribution. Google is made of us, a sort of coral reef of human minds and their products. And still we balk at Mr. Schmidt’s claim that we want Google to tell us what to do next. Is he saying that when we search for dinner recommendations, Google might recommend a movie instead? If our genie recommended the movie, I imagine we’d go, intrigued. If Google did that, I imagine, we’d bridle, then begin our next search.

and

> Jeremy Bentham’s Panopticon prison design is a perennial metaphor in discussions of digital surveillance and data mining, but it doesn’t really suit an entity like Google. Bentham’s all-seeing eye looks down from a central viewpoint, the gaze of a Victorian warder. In Google, we are at once the surveilled and the individual retinal cells of the surveillant, however many millions of us, constantly if unconsciously participatory. We are part of a post-geographical, post-national super-state, one that handily says no to China. Or yes, depending on profit considerations and strategy. But we do not participate in Google on that level. **We’re citizens, but without rights.**

Read [the whole article](http://www.nytimes.com/2010/09/01/opinion/01gibson.html?_r=2). A fascinating take on Google and cyberspace, from the guy who coined the word “cyberspace”.

also [via Brian Alexander’s Infocult](http://infocult.typepad.com/infocult/2010/09/kafka-glands-and-human-coral-reefs.html)

Bruce Schneier on privacy, security, control, and Google

[Bruce Schneier](http://www.schneier.com/) speaks at the 2010 EWI Cybersecurity Summit.

- Granular, explicit control over privacy is unnatural.
- Electronic commerce produces data. Everything we do produces data, in ways that traditional cash-based commerce did not.
- Businesses and governments are forcibly changing social norms. Who gets to make the rules?
- **We are not Google’s customers. We are actually Google’s product, which it sells to its real customers.**
- Data is the pollution problem of the 21st century.

Google and the UN Human Rights Declaration?

With the CEO of Google [declaring privacy a thing of the past](http://www.networkworld.com/community/blog/google-ceo-schmidt-no-anonymity-future-web) (along with [Zuckerberg](http://www.readwriteweb.com/archives/facebooks_zuckerberg_says_the_age_of_privacy_is_ov.php) and Facebook), how do we reconcile that with the [United Nations’ Universal Declaration of Human Rights](http://www.un.org/en/documents/udhr/index.shtml#a12), specifically [Article 12](http://www.un.org/en/documents/udhr/index.shtml#a12):

>[Article 12](http://www.un.org/en/documents/udhr/index.shtml#a12).
>
>No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.

Google thinks it’s a good idea to interfere with the privacy of every internet user on the planet, so that it can continue to make billions of dollars by selling the information it gathers, and by using that information to better target advertising.

The UN, and nations that are signatories of the Declaration, think that privacy is kinda handy.

Which side should we be backing?

Google predicts the end of privacy

[Alec posted](http://twitter.com/courosa/status/20679520356) a link to [an article about a presentation made by Google CEO Eric Schmidt](http://www.thinq.co.uk/2010/8/5/no-anonymity-future-web-says-google-ceo/) at the Techonomy conference (the article was a repost based on [the original ReadWriteWeb article](http://www.readwriteweb.com/archives/google_ceo_schmidt_people_arent_ready_for_the_tech.php) on the presentation). It includes gems such as:

>If I look at enough of your messaging and your location, and use artificial intelligence, we can predict where you are going to go.

which ties into my thinking about triangulating disparate bits of gathered information to build comprehensive profiles on anyone.

and this:

>”In a world of asynchronous threats,” said Schmidt, “it is too dangerous for there not to be some way to identify you. We need a [verified] name service for people. Governments will demand it.”

So Google’s pulling the “post-9/11 world” card? Unfrackingbelievable.

The only sane response to Google is that they can have my privacy when they pry it from my cold, dead hands.

Google, the company that makes BILLIONS of dollars by monetizing what it knows about every internet user on the planet, does not get to unilaterally call an end to individual privacy.

I think it’s safe to stop repeating the clearly meaningless “[don’t be evil](http://www.google.com/corporate/tenthings.html)” company motto now.

**update**: it struck me while riding to work this morning: I wonder if there are doors on the stalls in the Google Executive Washroom. Or do they only want to kill privacy that could make them money?

I think there clearly needs to be some form of regulatory oversight put into place before it’s too late.

Googlethink – displaced agency through the cloud

>Software programmers are taking the displacement of personal agency to a new level. Relentlessly focused on making their programs more “user friendly,” they’re scripting the intimate processes of intellectual inquiry and even social attachment. We follow their scripts when we click on one of Google’s keyword suggestions, and we follow them when we select from a list of categories to describe ourselves and our relationships on Facebook. These choices are convenient, but they’re not our own. They’re generalizations masquerading as personalizations.

I’m not sure RMS (Richard Stallman) could have predicted this, but this pattern is basically why he has been so emphatic about free software, and about being able to run the whole stack yourself.

More tinfoil-hat thinking, but we already know that Google uses your location and other data to refine search queries in real time **as you type them** – what else are these algorithms doing? That makes for a pretty powerful, real-time citizen-monitoring platform.

As well, the selection biases coded into the algorithms shape what we can and can’t see, and therefore what we can and can’t think. This is a far more powerful form of (potential?) censorship than outright banning of sites, because it’s invisible, and we have no idea what’s going on behind the curtain.

from *[Googlethink – Magazine – The Atlantic](http://www.theatlantic.com/magazine/archive/2010/07/googlethink/8120)*

precisely what we’re building…

A fascinating (and very long, but worth it) post by Steve Steinberg [on artificial intelligence](http://blog.steinberg.org/?p=11).

>If we were trying to build a true, general AI, we would first need to create a way for it to get around and interact with the larger world. And we would need a system for rapid knowledge acquisition, so that we wouldn’t have to manually explain every detail of how the world works.
>
>Which, of course, is precisely what we’re building.
>
>– Steve G. Steinberg, [new developments in AI](http://blog.steinberg.org/?p=11)

I’ve always had the nagging feeling that Google isn’t **really** about search or advertising, and that it’s quietly building the infrastructure to feed a growing AI critter.