April 27, 2005

Chip's A Yahoo!

One of the odder side effects of working closely with somebody else for nearly 20 years, which you only discover by not working with them for a while, is that a small but important fraction of what you know ends up being actually stored in the other person’s brain. I encountered this strange phenomenon in early 2003, after our most recent startup, State Software, ignominiously cratered in the face of our principal investor’s feckless amateurism as a venture capitalist. Suddenly faced with the need to get real jobs to put food on the table, our heroes were forced to take separate paths. Randy (after some exciting adventures that, as Michael Flanders says, we’ll tell you all about some other time) landed at Yahoo!, and I wound up in my present job at Avistar. It was after settling into the new job that I experienced the curious and disconcerting sensation of not being able to access some of the stuff I knew I knew, as it was in a different head 15 miles or so to the south. (I’ll let Randy speak for himself as to whether he experienced any analog to this weirdness.)

Thus it is that I am thrilled to announce that after next week I shall put down my hammer, tweezers, astrolabe, and other code refactoring tools at Avistar and become instead a fellow Yahoo! alongside my long-time collaborator.

Now nobody will be safe.

April 23, 2005

Prescience?

An addendum to Randy’s observation below. This triggered a memory of something our buddy Crock once wrote. He said:

There are three positions you can take on inevitability.

  1. Passive ignorance.
  2. Futile resistance.
  3. Exploitation.

Sony is moving from Position 1 to Position 2. eBay is in Position 3.

He was talking about Sony’s announcement that they were going to ban the sale of characters from their online games. This was in April, 2000.

But, as Randy said, just because they’ve decided to embrace reality doesn’t mean they’ll necessarily embrace it successfully.

April 20, 2005

Sony Slides to the Bottom of the Slippery Slope

Last October in my KidTrade paper, I asserted that eBay virtual goods markets are the direct result of design choices that have important (and potentially harmful) side-effects. Not all virtual economies need follow the same path. But some companies continue on boldly… In part, I wrote:

From Twinking to EBay:
The MMOG Virtual Economy Design “Slippery Slope”

  1. Gifting → Twinking
  2. Gifting + Multiple Chars/Server → Muling
  3. Gifting + Messaging + Trust → Trading
  4. Trading – Messaging – Trust + In World Machinery → Robust Trading
  5. Robust Trading + Scarcity + Liquidity → External Market (eBay)
  6. External Market – Trust + In World Machinery → GOM

It seems that Sony has embraced this inevitability and has announced that Everquest II will take the final step on the slippery slope and create an online market for users to exchange real-world money ($$$) for virtual goods, within the game.

I guess that’s one way to handle an economic design that leads to farming – rather than fix it, ‘legitimize’ it. :-P Honestly, a system that supports a market like this should be designed from the ground up to mitigate abuse and manage production rates. This feels so much like:

Ready…
Shoot!
Aim…

I can’t wait to see the TOS for using that market – this is a very risky play.

[Discussion pointer: TerraNova]

April 5, 2005

Stretching the Lessons

A response to Marc Hedlund’s Reading Yahoo! 360° through “The Lessons of Lucasfilm’s Habitat”


First, I must say that I’m personally flattered that Marc Hedlund and Clay Shirky think that The Lessons of Lucasfilm’s Habitat is “fantastic” and “Best. Essay. EVAR.” I’ve never been called a hero before. :-) Frankly, the paper is showing its age, so I was quite surprised that Marc chose it to frame his thoughtful and lengthy critique of the Yahoo! 360° service. You see, in the spring of last year, Chip and I started this blog, Habitat Chronicles, in part to archive and transcribe the “Habitat Redux” (ppt) presentation, where we take our original paper to task and talk about the new lessons learned since then. We haven’t transcribed everything yet, so it is understandable that Marc may have missed some of the hindsight, reinterpretations, corrections, and retractions.

Despite our best efforts at clear communications, The Lessons of Lucasfilm’s Habitat has become a bit of a social software Rorschach Test: In it, people see some wisdom that was not intended, or was quite accidental. Marc’s critique does suffer a bit from this effect and in a few places even acknowledges it, but he goes further as he re-interprets several of the lessons.

I welcome all thoughtful critiques of my work (new, and old), and think that several of Marc’s prescriptions are correct for Yahoo! 360°. But, since his critique used The Lessons as a framework in ways that I find personally challenging, this seems like the right place to respond.

Identifying the Customer

Marc says:

… Yahoo! Mail or MSN Passport is a bad idea, since the service’s goals will diverge from yours over time … an email address in your own domain will serve you better than one @yahoo.com. Great stuff, especially now that domain name registration is effectively free.

This statement typifies an attitude that separates us digirati from the rest of the world.

My mother [sister/nephew/etc.] can barely manage her web-based email. How on earth would she manage her own domain and server, and what would motivate her to cough up the additional money to pay for these services? I run a server farm from my garage but can’t get AOL to accept a connection from my mail server. Likewise, pobox.com bounces all mail from my machine as coming from a dynamic domain. I get spam from people trying to trick me into using their domain registry with deceptive mailers that look like bills. Why was my own email domain a good idea again? There is a definite cost/benefit trade-off calculation here.

It seems presumptuous to tell everyone that the free web-based email provider that they use is not good enough – that they should pay for so-called ‘effectively free’ services that are significantly more expensive and complex – when all they want to do is send baby pictures to their relatives. The presumption that you understand other people’s cost/benefit calculus better than they do was at the heart of the final and most important of the Lessons: Get real. For millions of people, web mail is plenty good enough.

Stretching The Lessons

How better, then, to look at Yahoo! 360°, than to take these lessons learned nearly two decades ago, and apply them to the brave new project? Yes, let us arrange the deck chairs to spell Habitat, and see how they feel.

  • A multi-user environment is central to the idea of cyberspace.

On the surface, Yahoo! 360° seems to have learned this in kindergarten. But the world is different than it was then, and there is not one Habitat, but many. It connects you to your friends with Yahoo accounts, and not to any other friends you might have…

We’d like to believe that eventually we’d be able to share with any friend the same way we can email with any friend, but look at Yahoo Messenger and its competitors, and that’s not the future you’ll see.

Yahoo! 360° is built fundamentally on the idea of sharing – what is more multi-user centric than that?

During Marc’s visit to preview the project, the development team shared that more integration with non-Yahoo! services was consistent with the goals of Yahoo! 360° and already on the product roadmap. Look at the recent release of web service APIs, deep support of RSS over all its services, acquisition of Flickr, and initial availability of RSS feeds on public blogs for evidence of this commitment. It is not a matter of will, or of understanding that it needs more integration. It is only a matter of having the resources and time to implement.

  • Communications bandwidth is a scarce resource.

Lessons talks literally about modem speed when it talks about bandwidth, but it also talks about attention bandwidth…Today, the problem with attention bandwidth is the number of applications that want my daily attention…RSS comes out of 360 only through blog postings, one feed at a time (no “show me blog posts by all of my friends,” as Flickr would have it); other system messages aren’t available at all without logging in…

The original lesson doesn’t mention attention at all, but I’ll take the bait anyway :-).

Yahoo! 360° is designed with the idea of reducing to one the number of places you have to go to get the latest photos, blog entries, reviews, favorite songs, group postings, messages, etc. That it currently fails for his case is a concern, but customers are already telling us that we have significantly reduced the number of separate sites that they have to ‘hand out’ to their friends and family.
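The aggregation idea behind this design – including the “show me blog posts by all of my friends” feed that Marc asks for – is easy to sketch. The following is purely illustrative (not Yahoo! 360° code; the feed contents are invented): it merges the items of several RSS 2.0 feeds into a single list, newest first, using only the Python standard library.

```python
# Hypothetical sketch: merge the <item>s of several RSS 2.0 feeds
# into one combined "all my friends" list, newest first.
from email.utils import parsedate_to_datetime
import xml.etree.ElementTree as ET

def merge_feeds(rss_documents):
    """Combine items from several RSS 2.0 XML strings, newest first."""
    items = []
    for doc in rss_documents:
        channel = ET.fromstring(doc).find("channel")
        for item in channel.findall("item"):
            items.append({
                "title": item.findtext("title"),
                # RSS pubDate uses the RFC 822 date format.
                "date": parsedate_to_datetime(item.findtext("pubDate")),
            })
    return sorted(items, key=lambda i: i["date"], reverse=True)

# Two toy single-item feeds standing in for two friends' blogs:
feed_a = """<rss version="2.0"><channel><title>Alice</title>
  <item><title>Hello</title><pubDate>Tue, 05 Apr 2005 10:00:00 GMT</pubDate></item>
</channel></rss>"""
feed_b = """<rss version="2.0"><channel><title>Bob</title>
  <item><title>World</title><pubDate>Wed, 06 Apr 2005 10:00:00 GMT</pubDate></item>
</channel></rss>"""

merged = merge_feeds([feed_a, feed_b])
print([i["title"] for i in merged])  # newest first: ['World', 'Hello']
```

The hard part of the real product, of course, is not the merge but everything around it: discovering which feeds count as “friends,” honoring permissions, and scaling it – which is exactly the resources-and-time point above.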

  • An object-oriented data representation is essential…

…360 represents data in HTML only; no access allowed unless you’re logged in through a browser…[The Lessons] hint at a web services API. My usage of my data might be arbitrarily elaborate; allow me to communicate with the service at a behavioral level rather than through your presentation…

Stretching this lesson to talk about web services APIs is probably appropriate, but not exactly the original context. I still believe that online services should offer object-level semantic interfaces, where it makes sense. And it makes sense for Yahoo! 360°.

Yahoo! 360° will have more feeds and APIs as soon as we can get them working. I can’t think of a service this comprehensive that had full web services APIs on its first day of Beta. Even Flickr took the better part of a year.

  • Detailed central planning is impossible; don’t even try.

Relationships between people in 360 — the social networking part — are bi-directional, as with Friendster and Orkut…

Sharing and the free flow of communication are not built around awkward social questions. Let me browse around, see what there is to see, and choose the things I like enough to want to see more. Let me send out my ideas and my pictures and whatever else to people who haven’t joined and haven’t linked to me, and may never…

The original lesson was a call to let the users create their own content, instead of depending on a single content creator (aka the bottleneck) to meter out the experience in a linear form. As originally written, Yahoo! 360° exemplifies this goal: There is only user-generated content.

As to the critique of bidirectional links: there is much debate on this topic – there are some things (such as permissions management, gathering recommendations by degree, etc.) that are simplified for users by establishing these kinds of relationships. Though no one in my family, nor any of my strong ties, has any problems with this connection structure, the desire to observe others’ public activity anonymously is a valid request and is currently enabled for public blogs via RSS feeds. The product team is considering other one-way connection schemes as well.

  • You can’t trust anyone.

A central point of 360 is its controls for permissions and… the privacy interface creates an expectation that it will actually do something to protect my privacy… The idea that a site like Yahoo can give me actual control over the distribution of my ideas and photos with a popup menu is absurd on its face; if the idea is sharing, then sharing will find a way. You can’t trust anyone but the promise of the site is that you can.

The original lesson explicitly limits its scope to client-server interactions, like an API. It is a software security lesson, not a distribution of information lesson.

As to the claim that Yahoo! 360°’s protection of your information is “absurd on its face”, I beg to differ and propose a challenge to prove my point: Marc – Please send me a copy of the pictures of the gift my daughter made last Christmas for a friend. They are available through Yahoo! 360° and Yahoo! Photos and are protected only by “popup menu” settings. You do not know who has permissions, and even if you can guess, they don’t trust you.

Do Linux file permissions (set from an even more primitive command-line interface) mean that those files aren’t protected? Come on.
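To make that file-permissions analogy concrete, here is a minimal sketch (the filename is hypothetical): a mode of 600, however primitive the interface that set it, lets the owner read and write while the operating system denies every other account.

```python
# Hypothetical file standing in for the protected photos.
import os
import stat

with open("gift-photos.dat", "w") as f:
    f.write("secret")

# Equivalent to `chmod 600`: owner may read/write; group and world get nothing.
os.chmod("gift-photos.dat", stat.S_IRUSR | stat.S_IWUSR)

mode = stat.S_IMODE(os.stat("gift-photos.dat").st_mode)
print(oct(mode))                             # 0o600
print(mode & (stat.S_IRGRP | stat.S_IROTH))  # 0: no one else may read

os.remove("gift-photos.dat")
```

A popup permissions menu and `chmod` are just two interfaces to the same idea: the enforcement happens in the system, not in the widget.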

Marc closes with…

What 360 gets right is that it isn’t about anything in particular; it’s a blank piece of paper with some wrinkles and lines. What users do with it, even in its closed-garden, api-less, feed-free incarnation, is their own choice, and they may well find a way to make the tool better than its designers originally intended.

This is no accident. Yahoo! 360° did learn Lessons from Habitat (and hundreds of other projects, papers, and products). What we didn’t do was get every feature/integration we wanted working before we put it out into the market.

But, the proof is in the pudding – watch the way Yahoo! 360° evolves and see for yourself.

F. Randall Farmer

March 29, 2005

Yahoo! 360 Invitation Only Beta

Yahoo! 360° has entered an invite-only beta and you can read my description of what makes it special on the Yahoo! Searchblog.

I’ve already sent out a few invitations, but I’m sure I’ve missed a few of my friends out there who are interested. I’ve got a limited number of invitations, so I’m distributing them only to folks I actually know (in person) and/or am related to. That means you have my email already – just drop me a line.

[I’m wondering if/when the invites will show up for sale on eBay…]

Into the breach!
Randy

March 20, 2005

Another Reason Randy Has Been Busy

I’ve been championing Yahoo’s just-announced acquisition of Flickr.

Flickr + 360?
Flickr + Search?
Flickr + Groups?

Use your imagination…

Things are getting a lot more interesting for communities at Yahoo.

March 15, 2005

Announcing Yahoo! 360°

Since we are both too busy to keep this blog up-to-date, I figured I’d share what it is that I’m doing that keeps me so occupied that I can’t keep up with the much-needed ranting about virtual worlds, avatars, secondary markets, security and the like…

Announcing: Yahoo! 360°
Press Release

As it hit the wires in the wee hours of 3/16/05:
WSJ: Yahoo Plans Service to Let Users Create Blogs and Share Content
Charlene Li (Forrester): Yahoo! announces blogging and social networking betas
AP: Yahoo Tests Blend of Blogging, Networking

Early bloggers are intrigued:
Marc Canter
Micro Persuasion (Steve Rubel)
SearchEngineJournal (Greg Sterling)
and hundreds more.

The invitations start at the end of the month…

November 24, 2004

Thanksgiving Lasagna

For reasons which entirely elude me, it has become the tradition in our household for me to cook a lasagna for our Thanksgiving dinner. I think it may have something to do with giving my wife an excuse to make me cook, but in any event I make a pretty good lasagna.

The recipe I use is one of those recipes that has been passed from friend to friend over the years, mutating ever so slightly at each step. My contribution to this process came about 25 years ago when I was first learning to cook for real. My wife (then girlfriend) was teaching me, but I tried to lean on her assistance as little as possible, mainly because I tend to favor the Jump Straight Into The Deep End school when it comes to learning new things. This recipe seemed pretty straightforward and self-explanatory, so I didn’t ask for a lot of advice. However, being a novice, I was unclear on the distinction between a clove of garlic and a bulb of garlic. The recipe called for a clove, but I chopped up a whole bulb’s worth and put it in. Janice, who insisted, despite my whining, on periodically checking my work in progress, tasted the simmering meat sauce and pronounced that it was pretty good but that it needed more garlic. I protested, and insisted that I had already put in an entire clove, holding up a garlic bulb as illustration. It was at this point that the magnitude of my error became clear. We learned three important things that day which helped draw our growing relationship even closer: (1) she really likes garlic, (2) I really like garlic, and (3) you know, it’s really pretty hard to have too much garlic, isn’t it?

Chip’s Lasagna

Meat sauce:
3 lbs. ground chuck (can substitute ground turkey for cheapness or leanness)
1 large bulb fresh garlic, minced
1+ tsp. Italian seasoning
1+ tsp. oregano
1 29oz. can or 2 15oz. cans tomato sauce
2 6oz. cans tomato paste

Brown meat slowly. Spoon off fat. Add other ingredients. Simmer at least 1/2 hr. uncovered (longer is better).

Cook 8oz. lasagna noodles as directed on package (rinse in cold water)

Cheese filling:
~2 cups Ricotta cheese (1 lb. carton should suffice)
1/2 cup grated Parmesan or Romano cheese
2 tbs. parsley flakes
2 beaten eggs
1/2 tsp. pepper

Mix.

Slice 1 lb. mozzarella very thin

Put 1/2 of noodles in 13″x9″x2″ pan.
Spread with 1/2 of cheese filling.
Cover with 1/2 of mozzarella and 1/2 of meat sauce.
Repeat layers.

Bake at 325 degrees for 45 minutes.
Let stand 10-15 minutes before cutting; filling will settle slightly.

Eat!

October 29, 2004

SOP II Presentation Slides

As I feared, many at State of Play were speaking about virtual objects as if they were users’ personal real-life property. The general reasoning is:

Secondary Virtual Item Market implies Real World Value and that implies that it is Real World Property and, by extension, subject to real world property laws.

Refuting this position is why I prepared the KidTrade position paper and released it two weeks early. I didn’t want to spend my time at the conference talking about KidTrade proper, but to use it as one of several proof cases that the formulation above is insufficient to rationalize the Real World Property hypothesis. I was able to achieve this goal.

The PowerPoint slides I used as notes are available, and as an added bonus the deck includes the “History of Habitat” slides I presented at the evening event at the Museum of the Moving Image, complete with screen shots.

UPDATE 11/12/04: The video archives are up and you can watch my panel.

October 16, 2004

The Vision: Cyberspace Protocol Requirements

In 1993 Electric Communities was a three-person consulting partnership: Randy, me, and our former Lucasfilm Games coworker and frequent collaborator, Doug Crockford. We had several clients, but by far the biggest and most important was Fujitsu. Fujitsu had hired us to do a business plan for bringing Habitat back to the United States, starting from the Fujitsu Habitat system they had developed in Japan. This was the project that eventually became WorldsAway (and which they eventually hired us to staff and manage during its formative years).

We, for our part, were far less interested in Habitat per se, which for us was yesterday’s thing, than we were in newer, more cutting edge stuff. The biggest limitations in the Habitat architecture, from our perspective, were that it relied on a unitary, centralized server, and that the environment was not user extensible. We wanted to put together a system which overcame these limitations.

We had some ideas about how to do this. We were heavily influenced by lengthy discussions with Dean Tribble and Mark Miller, who introduced us to the wonders of the capability paradigm for security and optimistic computation for implementing distributed systems. Dean and Mark worked at Xanadu during the time Randy and I worked at American Information Exchange. We got to talk to each other a lot because, for about five years, AMiX and Xanadu shared offices in Palo Alto (both companies being part of the Greater Autodesk Coprosperity Sphere at the time). But that’s another story.

Our arrangement with Fujitsu was that they would pay us to spend half our time working on WorldsAway and the other half doing advanced conceptual studies, planning, and prototyping for a next-generation system. They’d own the WorldsAway work and we’d own the conceptual stuff, though they would have the right to make use of the latter however they wished. It was a pretty sweet deal, especially when viewed in the light of these post-bubble years. We got to sit around and think great thoughts about pretty much whatever we damn well pleased while getting paid rather well for our trouble.

For the most part, I worked on the conceptual stuff (with a little bit of WorldsAway), Randy worked on WorldsAway (with a little bit of conceptual stuff), and Crock worked on figuring out how to present all this weirdness to the people at Fujitsu. After a couple of years of producing various presentations, studies, and reports, the ultimate end product of this effort was a hefty document grandiosely entitled Cyberspace Protocol Requirements, which we post here in public for the first time. (Note that although this is the final revision of this document, dating from February 1995, the principal work was published in March 1994 in a form only very slightly different from this.) This document is really quite embarrassingly dated, but we decided it would be worthwhile to get it on the record. It’s probably also worth pointing here to the manual for Dean Tribble’s Joule language, which was included as an appendix in the dead tree version of the Protocol Requirements document. Joule was our first round pick for the foundation language for the infrastructure we proposed building.

An important piece of historical perspective to keep in mind, should you actually attempt to read this: there is one seriously major development that had not yet taken place at the time this was originally drafted, namely, the rise of the Internet. The Internet was just beginning its explosive takeoff at that point; indeed, we allude to this. However, though people certainly expected that some kind of dominant, global, electronic network was going to be the wave of the future, at the time the Internet per se did not appear to be the foregone conclusion that in retrospect it obviously was. It was a time when all the various large media companies (phone companies, television networks, cable TV operators, etc.) were announcing mergers and new strategic alliances almost daily, in an absurd and furious spectacle that we referred to as “The Dance Of The Dinosaurs”. Nothing concerning the look of the future was very clear. It was a time when people were still getting used to the idea that commercial use of the Internet was even legal (which it hadn’t really been for most of its existence up to that point, since it had been a creature of the world of academia). In particular, we were getting along with FTP and Gopher and none of us realized that the World Wide Web would be the big winner. Indeed, those of us who had been associated with Project Xanadu were especially surprised by the success of the Web, since its technical deficiencies were, in our eyes, so profound that we never quite believed that anyone could ever take it seriously (as Mark Miller put it, “if we had known at the time that the thing didn’t actually have to work, we would have been done a lot sooner.”)

Another disclaimer is that there is a lot of visionary stuff in here that I either no longer believe or believe in a profoundly modified form. I’ll have a lot more to say about this in my next post, which I’ll have up here soon.