ASCII by Jason Scott

Jason Scott's Weblog

Fool Me 16,380 Times, Shame on Me —

Salutations, dewy-eyed fuckstick.

I’ve met you before, of course. I’ve met the type you are, the way you are, and your dreams you think you’re selling are nothing new.

Your whole culture is poison, shifting its language and buzzwords and goals on a whim to satisfy what you think people are going to buy into. You’ve sat back and seen what everyone else is doing, tried to figure out who is succeeding the most, and then you’ve done exactly what they are doing, except different. You’re nothing new but everything you are depends on convincing me you’re something new. You’re not.

You stand at the parties and the paid-for open bars and the shitty trailer-display fake-clubs and act like you want to be there, to find the right people to talk to, and make all the right connections. You don’t even know how dead your eyes look. You probably don’t even have a clue how you sound anymore, with your squeezed-together meatball of catchphrase and cool-not-cool references meant to pull in money and fan-friends like some sort of evolutionary-lucky flower catches bees. I’m not that bee. I might have been your bee once, but I am so no longer your bee.

You buy space and you allocate your meager funds and you shift your stories and you scrum your agile and everything is going to be a-ok, every day, every meaningful life-moment you’re experiencing now. You have a card that calls you a made-up title and a group that calls you a made-up name and this could be anything, anywhere, and you will ride it as surely as you rode life to this point, even when the darkness was behind you instead of in front.

So when your fluffball fake-dream bullshit story is choking down some part of the world, well, on its own, who cares? Not everyone gets to be lead dog. Some don’t even realize dogmeat’s on the menu until it’s too late. Fine enough for me – get into your car or walk down to your cafe, or stroll past your workshare’s hope-chest of names in the directory and take that elevator to the empty toy-room of your latest squashed-name pony show. Do that all damn day, until you write some essay that reads like the Dalai Lama after nine beers about you doing something else, whether it be to join some grey battleship selling 0.9 widgets for the price of 1.1 or playing drool-parent to your un-special offspring.

No, see, that’s a-ok. You go there, sport. You ride that all the way down to the splash.

The problem, that is, the part that gets me involved, is if your lame bulging sellout craptastic shell-game touches lives. Not just the throwing of coin at food servers or your soaking up the phone-line at the local pizza joint every Thursday. No, I mean lives, and the life they live, and what you’re taking from them.

You’ve glazed over, and that’s a shame, because this is where you should be listening.

But follow-through isn’t your style. Beginnings, big bangs, huge throwdowns and declarations of success at the cusp of the outset of the firsts – that’s your place, and where you live every damn day of the journey.

But, see, your story is as old as the hills, as dead as the moon, as tired as the second snooze alarm. And I’m tired of you telling it.

It’s the end, you see, and that’s when I get involved. The end. The big goodbye. The moment when you realize you aren’t pulling down rainbows, you’re pushing up daisies. That’s where I am these days, kicking around, holding the hands of the people stuck with you after your shining balloon fills with lead. I’m all crazy hat and loud voice, and when you most want to mourn your dream and turn to lighter lands, I’m there to drag you down where the bodies are buried.

When does it happen? When do you turn away from people you claim you worked for, loved, cared for, admired? Were those promises just words? Spoiler – they were. You don’t care for your customers, your audience, your fan-friends, your follows, any more than you care what leaf that bug you stepped on was carrying. It’s old hat, it’s yesterday.

Well, guess what.

It’s a brand new day here in reality. You have a problem they don’t teach you about in Douchebag Combinator.

The problem is me.

I’m here and I have friends, a team, an army more like, and they realized like I did what you are, that you’re just one segment of the snake but a segment nonetheless. We’re going to watch you, we’re going to hold you close, closer than love, closer than fear, and we’re going to make sure what you want to forget gets remembered forever.

Oh, you’re going to make noise. Making noise is what you do. You’ll use some nasty words, some untoward phrasing, some no-doubt effective-against-some blurbings about politeness and respect. Fine words, fine words indeed; too bad they utter forth from someone who would use their mother’s ashes to salt the plate of a million dollars, should it come to that. Like I said, I know what you are, and my friends do too.

What’s that? What’s that I hear?

Well, now that you ask, what I and all the people like me want are simple things, barely this side of demands, more like expectations… musings really.

We want respect. The real respect, that comes from people to people.

We want a right or two. The right to take back what you begged us to give you, to hold with you, to leave in your care. The right to take it back anytime. Any. Time. Even after the end.

And we want transparency, the sight into all that you are. No black boxes and dark clouds and missing words to hide the gaps.

We want no less.

It’s sad, really. It makes me sad. I’m making this all pretty clear, but I know you’re not listening. Maybe I’m speaking in the voice of the person who knows you’re going to screw this up, knows you’re just another dumb mouth spouting out the same old same old. But maybe, like a jaded player holding out at 17, things might be a little different this time, and you might say something you never say: I’m sorry, and I will make this right.

Until then, it’ll be what it is, what it always will be. You, and me. You, and me and the people who see the clarity I do and the people who fall for the likes of you.

See you soon. And forever.


JSMESS and the Big Day —

Photo by Raj

On Thursday, the 24th, (1024, in computer vernacular) I was part of a raft of announcements from the Internet Archive for the coming year. Reviews of great things that happened, great things happening, wonders to be shown. If you want to watch the whole event with pomp and circumstance (surprise guest and all), then here is the video on the Archive itself.

If you want to check out just the ten-minute presentation I gave, here’s a link right to it. And if you want the summary of what I talked about for those ten minutes, here it is:

JSMESS is real. It works. It works well, very well. It emulates 300 platforms and can run many thousands of programs on those platforms. It works in pretty much all modern browsers and will be continually updated.

But more than that, we have now installed JSMESS into the Internet Archive, in a collection called the Historical Software collection.

For people who follow this weblog, JSMESS and its progression from my initial call for it in October of 2011 have been a steady drumbeat, with incremental improvements the whole way along. But for a whole other range of people, these past few days have made them wander into the Internet Archive and realize, to their horror and delight, that a collection of programs is a click away from playing right in their browsers. One click, and an old machine is alive and running software. No plugins, no click-throughs, no downloading and configuring of emulators, and certainly no goddamned Java.

With the introduction of the Historical Software Collection, we have a real-world, dreams-come-true experience of seeing contextual descriptions of old programs, and an instant-click window that will bring you right into the experience of that old program. From “Oh, Pitfall!” to playing Pitfall! in less than 10 seconds. From there, it’s just you trying it out, ranging from a few pithy stabs at the keyboard to verify it works, to playing the crazy thing all the way through. It’s up to you! You’re not making anybody wait, you’re not trying to figure out if you got the right emulator program and where the cartridge image for Pitfall! is going to come from – you’re right there, like you are with movies, music and texts. That’s the magic. That’s the miracle.


As of this writing, dozens of major online news outlets have spread the news, either about JSMESS or about the HSC. Most are just reporting our weblog entry about it, but a few are doing some in-depth discussion of the meaning of it all, and the experience of using old things like Visicalc or Wordstar. It is, as I had hoped, a joy to read all this, to crash-land JSMESS into the Internet Archive and watch people flip out in joy and incredulity.

Along with this comes the usual “I never heard of this 15 seconds ago and damn howdy do I have something to say” Opinion Tourism that weighs into so much of online writing. Most of it I expected – very little is a surprise. I might as well answer it in one place.

A whole range of responses is “it is better emulated in a native/desktop emulator”. Well, YES. The difference is that in one case you’re in the process of tracking down materials, installing items on your desktop, running them, finding what ELSE is missing, and then getting it going. You’ve gone from an observer to a fiddler and a hobbyist in the process. In the other case, you click on something and it is right there. We’re going to make the JSMESS emulator better, more accurate (thanks to the work of MESS) and add features as we go. This was the big push to get this going for the big announcement, but we have a long way to travel. But we are travelling – everyone’s excited to get JSMESS better and better. But we will obviously always be one step behind native programs that aren’t fighting the browser paradigm, since we run at a reduced speed. The key is to make that step as tiny as possible. Where we will always be ahead is Time-To-Action – and that was the big reason for doing this. The day when you can say “Check this out” to someone and within seconds they are checking it out is now here. That is a very big day.

Another range of discussions is “awww, only 30 programs, what about _____”. A lot of the reason for this was that I wanted JSMESS and the entire paradigm of a software museum to be highly, highly curated. I wanted it to be an impeccable collection of unquestionably seminal and pioneering works. I wanted there to be no question that, aside from the fact that it has no physical space and can be called up on anything from a phone to a tablet to someone’s laptop, it was a truly valid place to learn about computer history. So we started small. On the technical side, the methods we used to add the programs took a lot of custom work, and I’d like to see us remake this down the road so anyone can add their old software and have it boot. I promise you – within a year, this collection will be in the thousands, far beyond what one reasonable person would ever need. It’ll be a place that historians and educators can point to for people to experience these items and comment on them.


A particularly nice moment has been to come back to the MESS coders, an amazing and intense group of people, and show real results. JSMESS is now something that, with some luck, will be accounted for in future MESS versions, with a few code modifications and other cleanup work being brought back. Contributors are helping on both sides, and I think both projects will benefit from that. We’re currently 2 years and multiple versions behind the latest version of MESS – the goal is to make that disappear.

People have already come to me, inspired by all this, to talk about contributing emulation code and support to the MESS project to ensure JSMESS supports their favorite machines and supports them well. That is exactly what I had hoped for – the feedback loop of seeing it arrive in your browser window, and knowing your efforts will result in something millions can use instantly.

It is absolutely a brand new day for software preservation. The question that now arises is – what is going to happen?


I have been to a lot of places over the last couple of years to tell people this was coming. We can’t go back – it exists, it’s up on github, it’s everywhere. People know, by the tens of thousands (and soon the hundreds of thousands) that this exists. They like it, they want it, and they want it smoother, faster, better.

I’ve talked with institutions about this approach of mine – it’s one of several out there, and I think it’s one of the slickest, but either way they come to it, institutions now have to think about how they will approach emulation of these programs. It’s here. It’s now. It is now very interesting and very immediate. There is no barrier to entry. None. Your browser has to work. You have to have the vaguest interest. Once JSMESS is a distributable plug-it-into-your-website package, it’s going to be in a lot of places – weblogs, websites, kiosks.

The flood of new users of emulators, an entire casual class, is going to have interesting effects in the world of emulation in general. More people savvy this exists, more folks wandering in to learn about all the hard work that’s been done, and with that, new demands and wishes. And both the emulator makers and the groups trying to do emulation as a service or emulation as a standard are going to find out that the hoped-for audience is here, right now, and they want action.

How will this brand new day work? It’s not up to me. I’m going to focus on the short-term – getting the historical collection filled with a few more dozen items people are requesting, cleaning up some of the related projects that came with this one, and working with the JSMESS (and maybe MESS) team to get things even smoother.

I can’t wait to see how it all plays out.


Scanning: Some Thoughts —

If perfect is the enemy of done, scanning has whipped out a switchblade and is standing next to perfect, ready to kick done’s ass.

Scanning is probably the number one thing I deal with, in terms of a process that folks can do themselves and which everyone has completely wild ideas about. How to scan, what to scan, and why scanning should be done. It’s a morass, a mess, a jungle of weird social mores and beliefs and anger and delight.

Before I go super-deep into this subject, I’ll take the approach of a quick pop-science list so you can grab that and run into the night, if that’s all you were looking for.

Here’s a quick scanning primer:

  • If you’ve scanned something nobody has before, you win.
  • If you scan something somebody has before, but better, you win.
  • If you scan something as well as somebody before but add metadata, you win.
  • If your scanned, metadata-laden collection of something is browsable/downloadable, you win.

Similarly:

  • If you scan something worse than what’s out there, you lose.
  • If you scan something and destroy it in scanning it, you lose.
  • If you refuse to scan something because you’re scared, you lose.

Or, as Michael Pollan would say, “Scan things. As much as possible. Mostly ephemera.”

Let me spend the rest of this going into what I mean about all this.


Life found me in 2008-9 in the basement of Steve Meretzky, who at the time still lived in Massachusetts in an absolutely lovely home that his kids had moved out of, and which, ultimately, he and his wife Betty Rock sold to move out west. (Sadly, the family that bought it tore it to the ground.) In the basement, Steve had collected years of material connected to his game-making history, and the history of several companies he’d worked at, including Infocom and Boffo. And I mean, he’d collected EVERYTHING – memos, scraps of paper, sketches, maps, even technical printouts, ads for materials tangentially related to game making, and invitations to parties and events he was running. He’d put them in binders and he kept the binders on a shelf. As I was making a documentary about subjects related to Steve, I asked if I could scan some amount of them.

Initially, I was in the basement. I sat down there for hours and scanned and scanned. Eventually, Steve trusted me enough to lend me the binders for an extended period of time, and I drove them home, these precious one of a kind materials, and I set up a scanning station in my office. All in all, I scanned a pile of these binders, with something like 9,000 pages in total.

I did NOT scan every single scrap of paper. I definitely did not scan all his binders.

The binders are now in the care of Stanford University, who have them available but charge for various types of access. I understand why they do this. I also know they have no immediate plans to scan them.

To scan, I used a $350 Epson scanner. The scanner was hooked up to a Windows laptop running Vuescan software. I scanned these documents at 600 dots per inch (dpi), which is relatively high. I scanned them into .TIFF files, which is a lossless format. I also made .jpeg versions on the fly as I scanned, so that there were easier-to-browse (but lossy) versions of the pages. It worked out to many gigabytes of material.
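
Vuescan handled the on-the-fly JPEGs for me, but if you already have a folder of TIFF masters and want browse copies after the fact, a few lines of Python will do the job. This is a minimal sketch, not my actual setup – the folder names and JPEG quality are placeholders, and it assumes the Pillow library is installed:

```python
from pathlib import Path

from PIL import Image  # pip install Pillow

MASTERS = Path("scans/tiff")   # the 600 dpi lossless masters, straight off the scanner
BROWSE = Path("scans/jpeg")    # smaller, lossy copies for flipping through
BROWSE.mkdir(parents=True, exist_ok=True)

for tiff_path in sorted(MASTERS.glob("*.tif")):
    jpeg_path = BROWSE / (tiff_path.stem + ".jpg")
    if jpeg_path.exists():
        continue  # never redo finished work, and never touch the master
    with Image.open(tiff_path) as img:
        # Flatten to RGB (JPEG can't hold alpha/CMYK) and write a lossy derivative
        img.convert("RGB").save(jpeg_path, "JPEG", quality=85)
    print(f"{tiff_path.name} -> {jpeg_path.name}")
```

The point of the split is that the TIFF stays untouched as the archival copy, while the JPEG exists purely so a human can flip through thousands of pages without waiting.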

Now, the reason these aren’t public yet is purely a case of decency – I need to go through the pages, find every time a person is mentioned, and assemble a version of the pages with that person in them to get signoff from that person. So I’d assemble everything from or mentioning Dave Lebling, and make a multi-dozens or multi-hundreds set of Dave Lebling pages for him to consider OK. When this project happens, all the material I scanned will go on archive.org.

600 DPI is pretty intense. Here’s a scan from a page:

That’s graph paper, which Steve used to do maps of his work. From this dpi, it’s possible to get a very nice closeup shot for, say, a documentary, which is why I did this work in the first place. GET LAMP has a bunch of intensely close shots of design documents, and you go in close enough that you can see indentation and pen strokes, not to mention tiny flaws in printer output and individual dots in color magazines.

Let’s step back and study this non-hypothetical hypothetical.

I’ve definitely taken a bunch of this material from the realm of “will never be seen” to “may likely be seen by fans and interested parties” by scanning it – someone could go to Stanford (or Steve’s house, way back when) and get to look at it as well, but that was a very small number of people compared to who might see it now. So, that’s good.

I scanned it at 600dpi. This means that it’s possible to get pretty close to the material in question and zoom in if text is small, or if you want to do a high-definition photo or screenshot of the material.

I put it in TIFF, meaning that the scan I did was not translated down into a “good enough” but low disk-space version of the material – you will not get the lossy artifacts that JPEG provides when you zoom in. Image zoom routines might even give a pretty good illusion of higher resolution. The tradeoff in terms of space consumption is minimal in the era of the sub-$50 terabyte hard drive.

And, of course, by scanning these I made sure that information-wise, these pieces of paper and information about sales figures, design documentation, interoffice memos and whatever else are not left on a single piece of paper in a dude’s house – Stanford has a copy of the scans I did, and I have them in a number of other locations as well. So they’re “saved” or “preserved” by some definitions of the word, and definitely the ones that people use when they find out something is just lying around somewhere, like, say, a dude’s basement.

(As an aside, these binders were already getting moldy – I swapped out the binders and Steve inquired about why I did that, until I showed him how the old ones looked compared to the new ones, and he was quite glad to see the old ones thrown out.)

Let’s step to the side and talk about everything that’s terrible.

First of all, I’m using a flatbed scanner, and flatbed scanners definitely can have dust, hair, imperfections in the glass, all of which might lead to some of the images being a little bit weird looking. I might have pressed a thumb against something, and if I didn’t notice it for a while, that print might be discerned here and there if you know to look for it. In other words, a person touched this all in a non-sterile environment. You do your best – you wipe the glass, make sure to have alcohol nearby, but it’s just not perfect.

Next, 600 dpi is awesome for some things, but to throw out the originals would be a crime. I didn’t get all the items, I possibly missed the backside of a two-sided document, I didn’t mark down the metadata about how the notebooks were arranged… a whole set of information would be lost by going about it that way. And you never know when you need to zoom in a little farther…

Finally, a TIFF is not paper. A scan is not the object. The photo of something leaves you a couple steps away from the original item, no matter how good the lighting, how true the color, how right the process.

[Scan: an Atari 400 page from a 1979 Sears catalog]

The fact is, almost any scan can be ripped apart depending on your level of standards, and what you think a scan needs to represent. In the item above, a certain level of people are delighted to be able to see photographs of an Atari 400 and related products, the fact it came from Sears, the prices for everything, the launch/available titles, and, maybe, the unique grille of the television set the example machine is connected to. This is all information, and if you didn’t have the catalog this came from, you just got some great information.

Or you can concentrate on the smaller resolution, the fact you can see ghosting from the other side of this thin catalog page, and the yellowing/discoloring. There’s likely a dozen other optical and arrangement flaws I’m missing, but they’re there. For some, these are incalculably fatal mistakes. It moves this from the realm of captured history to cheap trinket.

Let’s break it down further.

As I said above:

  • If you’ve scanned something nobody has before, you win.

Maybe this isn’t the ideal specimen. Assuming they didn’t destroy the catalog to get this scan, it’s still a great thing they did it. There’s information aplenty in this thing, it proves the page existed, it proves the catalog existed, that Sears sold these, and it gives you a range of pointers to dig deeper, should the mood or necessity strike you.

Of course it could be done better – you could scan it at ludicrous DPI. You could do multiple scans at various contrast/brightness levels, and recompose a top-quality version (that maybe never existed in reality, mind) that would be suitable for a poster or an art book. You could scan the other side, and, using some amazing algorithmic mojo that probably killed some graduate student’s year, remove all trace of the back page from this scan. You could even painstakingly RE-DO the page from scratch, using this as an artistic guide while you vector this to perfection, meaning near-infinite zoom capability.

The flaw, the miserable mistake that I’ve now seen over the years, is the people who think that this scan, not being perfect, should not be done.

I get the mails. I have the conversation. The people who sit on items that are important, that have historical heft, who are waiting for some mythical moment in time when the ability to scan something perfectly every time, conducted by themselves or by some institution paying for it, will ensure the majesty of the item is maintained forever. I know these people. They are among my people.

The most potent of the arguments they have against doing ‘okay’ with scanning is that the ‘okay’ scan will flood out any future attempts to scan some material, because someone “did it” and nobody will want to do it anymore. The craptastico initial version will be the “winner” and that’ll be that, the history is lost.

I happen to think this is the position taken by lonely competitive personalities.

It’s faith-based either way, since it relies on actions not yet taken or actions not yet avoided, but there’s plenty of examples of re-doing something so that it’s a better version than what was there before, or taking an extant item and remixing it into a more complete or contextual experience. I happen to think that doing an ‘okay’ scan (without doing an intentionally poor scan) is an excellent first step – as long as it’s paired with the approach of non-destruction.

  • If you scan something and destroy it in scanning it, you lose.

I have never seen two parties come to a conclusive agreement if one of them is a bearded nerd going “NO, NO, NO!”. But I will say that passion as regards the destruction of an item is an understandable reaction. In many cases, items are bound, glued, stapled, attached, and otherwise not compatible with scanners as we tend to deal with them. There’s a decision tree there: Do you get the best scan you can without wrecking it, meaning some information doesn’t make it? Or do you destroy the pages, cut them up, remove staples, and end up with a broken not-quite-awesome pile of what used to be the item, and better scans?

Well, there’s subtlety at work here, and nerds don’t always do subtle. There’s definitely the ideal of a “frame-off restoration” in the realm of cars, where you take the car completely apart, down to the frame, and fix the frame (maybe even fabricating a new one) to eventually end up with a better-than-new car at the end. Similarly, if you’re seriously dealing with an item so important that you need to capture every little detail (say, the first Worldcon program or a prototype magazine that came from a publisher’s private collection), then you’re in a realm of recovery and intensity that is likely being accompanied by funding or donations anyway.

Or, another way to look at it is that it’s “just a magazine” or “just a book”, and so pulling it apart to get a better scan might be a willing sacrifice for you. Certainly many of these materials have been thrown away, with their owners doing nothing to save them. One being gutted to bring an item online might, with that perspective, be worthwhile.

It puts me in a precarious moral position, but I’m used to those: I do not like that it’s done. I will regardless take items in which this was done and use them. If one thinks of the destruction as being inevitable, then destruction plus scanning almost makes sense, and certainly scanning plus availability is the best of a poor situation.

But that’s the core issue of this entire scanning situation. I’ll limit it to what we call the “vintage computing” culture, but the scanning of these older technical materials amounts to an altruistic act in service of a murky future. It is not clear what people will use these for, or what part they will play in lives, or how much value the information being saved represents. It is a gamble, a shot in the dark, and the question quickly becomes how important it is for these items to be “perfect”, or, again, whether we will be satisfied with digital copies of material without any remnant of the analog, real-world materials (paper, floppies, cassettes) they came from.


To some extent, we’re in good shape – nobody has issued a concerted effort to wipe computer history from the face of the earth, nobody has banned it, nobody murders the people behind it, and the items, while experiencing dips and valleys in perceived value, are generally considered to be “neat”, i.e., worth keeping in the attic a few more years.

There are a lot of copies of magazines out there, especially the big ones that people remember like BYTE, Creative Computing, Compute!, A+, and so on. If it handled home computers, or video games, chances are there’s quite a few copies out there, and a person who issues a concerted enough effort and is willing to outlay a bit of cash in various silly directions could get a complete set. That’s not the situation with, say, event programs from computer and hacking and vintage festivals. It’s definitely not the situation with corporate memos or warranty cards or identification badges. The more we move away from “periodical”, the murkier it gets. And when dealing with people, I find a lot of folks put value on, and understand the meaningfulness of, “magazines” or “newsletters”, and much less on, say, the free bookmark from a prominent computer store that existed in the 1970s.

This is why, like I said above, ephemera, which doesn’t usually have a binding and which can fit in a flatbed scanner, is generally a better thing for people to be scanning than books and magazines. In general. If you want to “help”, that’ll “help” more than anything. 600+ DPI. Use TIFF or another lossless format. Look it up.

But what if you want to scan books and magazines anyway?

In terms of the best-case situation for scanning a book without destroying it, I am, perhaps, luckier than most. I have a $25,000 book scanner in my house.

Last year, I requested, and got, one of the Internet Archive-created book scanners. They’re called Scribes, and they’re a masterwork of metal, glass, optics and mechanisms designed to allow easy scanning of books.

I have it installed in a room in my house. To my great shame, this year has been very, very busy and it is only recently we have it functioning to the point that I can really begin scanning books in earnest. But here it is, and it’s ready to take in books.

It has been incredibly informative to see how the Internet Archive (and the books group there) has approached the scanning of books, and the different advantages and disadvantages of this approach.

The Internet Archive Scribes are a great example of choosing what’s important to you and going with it, even if other stuff has to take a bit of a hit. The Scribe presents a V-shaped cradle that a book rests on, and two high-quality cameras take a photo of each facing page at the same time. The resulting pages are then stored on the server, and you turn the page of the book, and then do it again. Here’s what a station looks like:

[Photo: an Internet Archive Scribe scanning station]

The advantages are these: You can scan a book really goddamned quickly. It’s possible to do 1000 pages an hour if you’re really on top of your game and the book isn’t a finicky mess. 1000 pages an hour! No joke. You can blow through a standard 200-300 page book in about 10-20 minutes or less if you’re (again) lucky. The V-shaped cradle means you go RIGHT up to the binding of the book and you do not break the binding to do so. In other words, the original item is not destroyed.

To do this, again, a high-quality camera is taking the photo – but it is, after all, a camera some distance away. DPI can be between 300 and 600, and it will never be as good as putting it into a flatbed and letting that insane camera element slowly drift across the page, pulling in every last optical feature from a half-inch away. But it will be very good, and you will get a hell of a lot of books doing it this way. The Internet Archive is able to add a new book every ninety seconds, 24/7, using this method.

Doing odd objects, like placemats or software boxes, is much more difficult with this setup. Magazines, especially ones with shiny paper, are also a bit of trouble. This is the tradeoff. Stick with books, and there’s a LOT of books, and you do very well by these machines. Otherwise, you work a little harder. (And sometimes working harder is quite worth it.)

The V-shaped design is similar to something Google uses, and to what the DIY Bookscanner uses. It’s a method that requires human labor (no automatic feeder or page turner) and it results in images that need cropping and adjusting afterwards.

This is the secret sauce of the Internet Archive Scribe – the processing software is amazing. It presents pages nicely, lets you declare items of metadata, rescan images, remove broken items, do cropping to sets of pages, and otherwise get repairs going very quickly on these books before wrapping the whole thing up in a pretty bow and shoving it into the Internet Archive ingestion system, itself a wonder of programming and automatic repair. You also can declare what pages are tables of contents, chapter headings, covers, indexes… and so the resulting item is much more usable. Without being destroyed. And only a hair less readable than a careful, by-hand, take-all-day process of a flatbed and/or cutting the book completely up to turn it into a pile of flat sheets.

It is a wonder.


The letter that asked me to write about scanning also asked me to touch on legal issues. Here’s a hint from someone who’s actually been through the legal system: there’s no point in discussing the legal issues. No. Point.

Scan because you’re concerned that there’s no record of an item that is easily accessible to a future audience. Scan because you think you have something unique and want to ensure there’s multiple copies of it, even if those other copies are simply digital files. Give your work to people who will hold it for you. Put it up yourself. But if you’re truly asking yourself if you can do what you are doing, nothing I say is going to give you advice on that. I am neither a legal expert nor a pep rally. If you’re uncomfortable with doing something you think you should be doing, look around until you find someone who is comfortable with doing that thing and give it to them. That’s all I have for you.

Finally, there are some inherent questions in the whole process of scanning.

Scanning is, at its most fundamental nature, a photograph. It’s an analogue, a rendering, a rendition of a thing, an item. It allows some percentage, always less than one hundred, of that item to be in multiple locations and reach a wider and more diverse audience. It is a leverage and a bargain against oblivion and elitism, where the chances improve that this item’s information and nature will travel far beyond a single place.

It is also a foundation, one can hope – a foundation of existence and reliability for an item to be expanded upon. It can be contextualized, parodied, referenced. It stands as some level of proof of an idea, and it holds itself in the center of more in-depth historical research and tales. The nature of digital creation is not one of preservation in the sense of an exacting clone of a once-offline object – it’s to create a new object imbued with the advantages of the digital world and the memories of what it once was.

Arguments and hand-waving come when different factions of creators, historians, fans, caretakers, professionals and gatekeepers all converge on the original and digital objects that came from the same source, and they bemoan the attributes unique to these objects and what advantages and disadvantages each hold. There has been a lot of wasted time and inches both printed and displayed over this fundamental nature. What matters to me, ultimately, is that human beings saw some value in an artifact that came from the past, and they want to sustain it, for reasons selfish and charitable, for the future.

That’s why we scan. And that’s why we scan again and again.

Hope this helped.



The Game Preservation Discussion Shortcut Cavalcade —


My work in saving computer history and other materials means I’ve been spending a lot of time in a lot of rooms. In these rooms, I’m either on stage (if there is one) or in the audience. The room might also be a conference room, at which point I’m in a corner or somewhere in the middle. There’s sometimes snacks.

It’s all very nice and people don’t punch each other out and there’s been no actual deaths. All in all, there’s much worse ways to be spending time.

That said, I’ve now spent too much of this pleasant no-death occasional-snacks time discussing the same things consistently, or hearing the same discussions go by, discussions I think are pretty much settled.

It’s maybe too much to hope for, but perhaps we can shortlist this out and get onto actual actions. I’m pretty busy and I’m sure you all are too.

So, let’s go over Game Preservation, which almost always means Computer Game Preservation or Video Game Preservation or Electronic Game Preservation in the modern era. Let’s just blow through these suckers and we can stop making 20-30% of every game-history-related event I sit through a replay of this list. OK?

  • Some people don’t think Game Preservation is a Thing, Valid or Otherwise.

The effort to recognize preservation as something worthwhile to do, or to indicate games are worth saving, is a classic example of “what are we doing, really?”. Here’s what’s going on: Games are worth saving, just like any history is worth saving. In the long term, the human race will either stop paying attention to the old thing, or they’ll re-use the old thing, or they’ll worship the old thing. If you’re in a room full of people who are listening to a discussion of this nature, I’m going to bet a couple quarters that an extremely small percentage need to be won over to the idea that games are worth saving. Stop defending. Move forward.

For people who care about “getting funding” or “acquiring resources” or “applying for grants” and the rest of the fun way folks feel they need to survive doing something, the “use case” for game history and preservation being critical is inherent in it being done. If someone doesn’t buy in even with that situation, they’re never going to buy in. People know this stuff should be “saved”, whatever that means. You should be saving.

  • Game Preservation of Cassette/Disk Products is Pretty Simple.

Get a box. Make it really really big and made of metal. Inside it, put computer game boxes containing floppies or cassettes and manuals and whatever else came along with the box. For bonus points, put them in strong, stackable containers you’re going to put inside the box. Try to make sure the metal box isn’t too low on the ground (floods), stacked under some wood (fire), or in someone’s way (politics). TAH DAH YOU PRESERVED THEM

Oh, for sure, that’s pretty low-standard preserving, right there – you want to digitize things, or take data off of the cassette/disk products, and a range of other activities that people do with old stuff. But stop spending time acting like “Preservation” is some Herculean act rife with pitfalls of destruction and misery, and one that needs years of study to get just right. The issue, really, is one of degrees – that is, how long you intend to preserve the materials, and what steps need to be taken to deal with the physical items.

We’ve been taking “stuff” and putting it “places” with a variant resulting success/failure rate over the ensuing period of time, for thousands of years. Sometimes we lose it within a day. Sometimes we don’t lose it through multiple empires. The winners and the losers are not easily picked, either. It’s all crazy, I’m the first to admit.

But if you don’t recognize that history has shown that The Weirdo With The Huge Pile of Shit has, in the aggregate, contributed as much to Saving History as J. Pierpont Fingerbottom’s Historical Society and Teahouse, you’re not paying attention.

Furthermore, the world “gets” the saving of Physical Products. That is, if you say you have a bunch of old Coca-Cola bottles, or a collection of sewing machines, nobody’s going to say this is something brand new that has never been done, or that this is something against the laws of God and nature. Sure, some people might call it boring or a waste of time, but people do this with everything.

So, a pile of game boxes, be they CD/DVDs or game cartridges or aforementioned cassettes/floppies – great. Fine. Stop worrying.

  • Preserving Games with No Physical Product is Pretty Hard.

Just as everyone sort of got their shit together over “video games” being defined as a series of boxes containing things, the model completely and utterly changed. What we now call the App Store model, be it Steam, iTunes, Google Play, or the like, is where everything is being sold to any great extent. Sure, there’s a couple insane top-shelf items that still get pushed out through stores, but they’re the final massive whales that need to exhaust every single sales channel possible, and are tent-pole attractors in the games industry – all their related items are then sold through the same App Store model.

There’s a lot to be said for the saving/storing/preserving of the physical products! And it’s being done! But if you think of physical products and data/online products as the same thing, trying to apply the same rules, you’re screwed. Stop doing it.

App Store products are capable of changing their nature, overnight, whether you want them to or not. They’re capable of being deleted out from under you. They’re especially prone to incremental modifications that you would barely even notice if you didn’t sit there, logging the crap out of them, saving away copies on a nightly basis to compare with the previous day’s example of these programs. The paradigm with data/online games is not physical products – it’s radio and television. It’s potentially changing, easily edited, ethereal, ephemeral non-items that might be exactly as you left them or exactly different, with your best hope being a series of accepted versions that everyone recognizes as being somewhat better than nothing, but worse than everything. That’s how it is. Deal with it. Move on to strategies of downloading and squirreling away copies of items, breaking the living hell out of Terms of Service, to ensure these materials live on in their various states.
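
To make that nightly squirreling-away concrete, here’s a toy sketch in Python: fetch a copy of each watched item, hash it against last night’s copy, and only keep the dated snapshot when something actually changed. The watchlist URL and file layout are invented for illustration – a real storefront will make you work much harder than urlretrieve does.

```python
import hashlib
import urllib.request
from datetime import date
from pathlib import Path

# Hypothetical watchlist: a name mapped to something fetchable.
WATCHLIST = {"some-game": "https://example.com/some-game/download"}
ARCHIVE = Path("snapshots")

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

for name, url in WATCHLIST.items():
    item_dir = ARCHIVE / name
    item_dir.mkdir(parents=True, exist_ok=True)
    tonight = item_dir / f"{date.today()}.bin"   # YYYY-MM-DD sorts chronologically
    urllib.request.urlretrieve(url, tonight)     # tonight's copy
    older = sorted(item_dir.glob("*.bin"))[:-1]  # everything before tonight
    if older and sha256(older[-1]) == sha256(tonight):
        tonight.unlink()                         # unchanged: no need to hoard a duplicate
        print(f"{name}: no change")
    else:
        print(f"{name}: new or changed version kept as {tonight.name}")
```

Run it from cron every night and you end up with exactly what the paragraph above describes: a series of dated, accepted versions – somewhat better than nothing, worse than everything.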

  • Game Preservation is Not Just Preserving the Data/Physical Items.

I’ve had to sit in the room for a lot of discussion about this, so if you haven’t hopped on this train, hop on it. Games are not just the games themselves; they’re also the experience of playing the games, the stories and lore we build around them, and the life in gaming that we live. So even if you use tools to pull the data off game materials, or you save the App Store item, or you photocopy the dice, or whatever else you do, there’s a whole bunch of other material out there to collect, which is what takes so much time.

Capturing this additional material is difficult or non-intuitive or involved, but it’s vital to capturing game history. Second Life without the people is empty rooms. A Monopoly set without the family that played it is some cardboard and plastic/metal pieces with paper strewn about. Heck, two dice sitting on a table are a million amazing nights and stories that are lost if all we keep from all that is the dice themselves.

Got it? Good. That’s ten-fifteen minutes of every panel saved right there.

  • For a Whole Range of Games, We Are Entirely Fucked.

Sorry to be depressing, but there’s now so many games being made, and so many games being worked on, and so many variant ways we do gaming, that there’s a lot of stuff that’s going to disappear.

Examples? Independent games nobody bought. Online games whose servers disappeared with no warning or without a trace. Programs written inside other environments that have shut down. Notes and prototypes and unfinished works that get dragged to the trash, physical and operating-system-wise, with no advocates to save them.

Many industry-style games now depend on centralized servers to stay working, and Game Companies, I’m sad to say, make extremely poor landlords. Hell, many of them make extremely poor Game Companies. And these reluctant landlords have shown they will shut down their central servers, with no export function and certainly with no transfer of the material to another more stable body, within as little as two years.

To bring this up in a discussion and then go into it for a while, to masticate all the ways that the Games Industry is a piss-poor keeper of its own history, is redundant and wasted energy. Move forward! Talk about new ideas on how to attack this unsolvable problem. While it may be possible to mitigate it, acting over and over again like this is news just cuts hours out of time you could be spending doing something.

It’s easy to lose the thread of what I’m saying here. These are all valid, important facts in what is the growing Game Preservation field and discussions therein, no question about it. But anyone who’s spent more than 60 minutes considering preservation of this sort, be they professionals, amateurs, or bystanders, completely gets these. They’re not really up for debate anymore. They were settled, the same way someone discussing programming knows It’s Good To Have Backups And Revision Control If Possible. Except for the occasional contrarian “I couldn’t come up with a good topic so Fuck The Law Of Gravity” presentation, I seriously think we can optimize these out of the discussions and move to The Good Stuff.

There’s a lot of Good Stuff. Let’s get to it. Otherwise, don’t be surprised when I whip out the iPad to get actual work done while you tell me preservation is hard.

 


Some Last Triage of the DEFCON Documentary —


I need to turn to my current projects, and my other commitments. I figured after going for a month and a half into the post-DEFCON Documentary release breach, I’d write some thoughts down before moving on.

(Note, I do intend to release a fixed-up version of the documentary with a different audio encoding and with one or two factual errors stitched up. The length and content won’t really change.)

So, I guess the number one thing I wasn’t prepared for with a released DEFCON doc was the number of people who hate it.

I mean, to be sure, the nature of the subject, and the amount of attention DEFCON receives as an event, ensured that the number of people who’ve seen this movie and who’ve been given the chance to weigh in with opinion and reaction is much greater than for my previous films. BBS Documentary has been seen by hundreds of thousands of people but it’s been a slow flame for nearly a decade to get to that number. GET LAMP is also out there but the audience is, again, people going out of their way to see it. DEFCON partially subsists off of press and attention, and so the documentary got a lot of that attention. Within a month, well over 150,000 people have seen this film in one way or another. So, the results have been more variant, and I’ve reached a whole new audience, both positive and negative.

A good indicator of dislike is the IMDB entry for the movie, which as of this writing gives the movie a rating of 5 out of 10. GET LAMP lurks at 6.5/10 and BBS at 7/10, so it looks like I’m not universally loved at IMDB. Additionally, the two comments/reviews on the DEFCON Doc entry at IMDB are vicious tear-downs, one feeling it was meaningless and the other that it was a two hour commercial for DEFCON. There’s a review or two out there of a savage sort, as well.

Let me explain why I’m focusing on this, and why it gets such attention from me – there’s two different ways you can make a creative project that people hate (or love): intentionally, or accidentally. If you make a 10 minute film with 500 murders and nobody has a speaking part, that’s pretty intentional and you can expect people to love or hate that. If you make a film about a girl who falls in love with a boy, and your costume person picks random colorful laces for the boy, not aware that in some subcultures those laces demarcate what races he hates, and then you’re getting all these angry groups calling you a Nazi, well, I’ll file that under “accidental”.

Choices were made to make the movie the way it was, which I will shortly explain. If someone hates those choices, that’s fine – I specifically made them. But I keep my eyes open for cases of people hating something that I had no idea about, because if I’m going to do three more technical films, I want to make sure I’m not going to make these stumbling choices again, if they can be avoided. So I focus. I study. I learn. That’s how somebody gets better.

So let’s go over it.


There were two major issues in my face in taking on the duties of a DEFCON documentary. One was the fact that it had never been done before to any great level of depth, and the other was that by the 20th year DEFCON was now so goddamned huge that it wouldn’t fit into a single film.

When faced with the second problem (large subject) before, I made a mini-series. I chose not to make this one a mini-series, because I thought that would be way too much for people who were expecting “a documentary”. With that choice, I nixed in-supreme-depth subject study of specific events and people, because there was no room. It was a tough decision, but I went with it.

The lack of any in-depth previous film (there were a few that came out before, but they were either light news stories of a few minutes, or a set of specific interviews of a few people) meant that there was a lot of expectation of how DEFCON would be covered in one. I couldn’t say I was making an updated version of a previous film – this was going to be The Big One. I doubted DEFCON would agree to do anything like this again. So it was on us to do this as right as we could the first time.

I decided to focus on DEFCON 20, with scant callbacks to how we got to 20. A whole range of people whose golden age was in the years before 20 were pissed as a result of this decision. Understandable. Who wants a movie about something you gave years of your life to and it doesn’t even mention your contribution? A few floated the theory that I was intentionally cutting them out as some sort of payback or conspiracy, but that’s what people do. Hint: No.


So, given that it’s a DEFCON 20 movie, and was going to be a single 2 hour film (I’d decided on two hours way in the beginning), here’s where we went from there.

First, I knew no matter what film it would end up being, I’d need sit-down interviews with people related to DEFCON, especially people in charge of it. So I started shooting lots and lots of sit-down interviews throughout the country in the months leading up to DEFCON 20, so that I’d have footage to fall back on, instead of depending on what came out of the event. I was right in this regard, because people get really busy a month before DEFCON, and at the event itself they can’t possibly spare a moment for interviewing unless other people take on the brunt of the weight (this is what happened in a small handful of cases). There went my summer, but I did end up with 50 interviews in the can, so I was set there.

The advantages of this (footage) are weighed against the fact that people are telling me what they think of DEFCON, and those people are also people who work for DEFCON, so there’s a bias all the way down the line – everyone really likes the event. This energy threads throughout the resulting movie, although it does mean that everyone seems really, really cheerleader-y. I happen to think this is an accurate emotion, that is, they really do love DEFCON and dedicate so much time to it because they love it. But it is definitely a candy-colored sugar rush that some people might not like.

In the months before DEFCON 20 happened, there was the release of a film called Comic-Con Episode IV: A Fan’s Hope. It was directed by Morgan Spurlock (although he has some very heavy-assisting producers for this and more recent films), and my producer Rachel Lovinger and I were lucky to attend a screening of the film that had both Mr. Spurlock and his producers in attendance. So we got to see a template of “what does a movie about a big event look like, and what works and doesn’t”, before we went into our mess. Spurlock and Co. were very friendly and forthcoming with answers about how they did it. This helped a lot.

So, right away, we hit the issue of structure.

How do you tell the story of an event so large? A Fan’s Hope did it by following a set of people through Comic-Con, from months before the event through to the end of the con (and possibly with some post-con wrap-up interviewing). Will the cosplayer get her costume done in time and win an award? Will the artist hopeful get the approval he seeks and maybe full-time employment as well? What of the guy who wants to propose to his girlfriend at the con – will he time it perfectly? The way that they interlock these stories, you end up experiencing the con through the eyes of individuals within it, with small general-bird’s-eye-view sequences here and there. Bear in mind, this means an enormous amount of Comic-Con does not get into the movie, but for that price paid, people watching connect emotionally with the attendees in a big way. You want the costumer to win! You hope the guy gets out the ring at the right time! You are rooting for characters in the film, while also learning about Comic-Con in a general sense as you go.

I considered this approach. I rejected it for DEFCON.


There were two reasons for this critical decision. One was political and reflective of who picked up the check, and the other was history-minded, to decide how much of DEFCON needed covering and the unique way hackers interact in groups.

Political: DEFCON is a paid-for movie. I took a fee up front and 100% of the budget was covered by DEFCON. They actually had very little oversight of the choices of filming, but the mandate was clear – cover DEFCON. Not to spend an hour telling the story of J. Crew Buckyball and his journey into DEFCON, but to have someone who’d never heard of DEFCON watch the movie, from the beginning, and leave with a real sense of what this thing was and why they might want to attend (or decide to never attend). To this end, I chose to give as many of DEFCON’s unique events and attendees a spotlight as I could, to give a sense of what makes the event what it is.

History: DEFCON, like a lot of hackerdom, has really been taken advantage of by press and general organizations, who have derided these folks as scary, this-side-of-sociopathic cyber-killing machines. If that sounds extreme, you haven’t been reading. Trust me, it was a big deal to have the film instead treat people as just normal folks with skills and pride, working together to do cool and fun stuff. I had a conversation at DEFCON 21 with someone working on a hacker-referencing entertainment product, and I said “What you need to understand is that hackers are portrayed as destructive loners, when they’re really creative sharers.” And the film bends over backwards to go that way.

What are the downsides here?

Well, by not making it about people, there’s less to personally attach to – there’s no “plot” to follow, in which you wonder what’s going to happen to your onscreen doppelganger and you wait eagerly for the wrap-up. To counteract this, I added a strong chronology to the film, showing you the passage of days. Intentionally, Friday and Saturday are bloated while Thursday and Sunday are cramped. You feel the passage of time at the beginning and right at the end, when you’re least in the mood for things to take forever. For some people, the film feels long, but I guarantee it’d feel even longer if you were waiting for something, anything to tell you where we were on the timeline.

Additionally, most sequences on a subject are one to two minutes, with a couple of three-minute sequences and one six-minute sequence. If you want to get a sense of everything there, it works well. If, however, you become intensely interested in one subject (say, how the Scavenger Hunt works, or what happened to the Badges), then it moves away while you go ‘nooooooo’ in your seat. This is the core of the Devil’s Bargain – more information, less depth.

I’ve had a few people say “It’s not a documentary”, which is, of course, silly – it’s a presentation of real events, recorded in real-time, with no pre-scripting, of un-hired people doing things they would have done at an event they intended to attend. There’s nothing fabricated at all in the film – in fact, the one fabricated scene we shot (a demonstration of how the Social Engineering booth works) I chose not to include for that very reason. I don’t like making stuff up when reality is as interesting as it is.

So, what I think these people are really saying is “this is not the kind of documentary I wanted to see or like to see”, and I’ll give them that, whatever that means. It’s the same with people who hate the music – that’s a personal choice, and other people have begged for track lists of what is playing where. I can’t make people like music they don’t like, and I wouldn’t have chosen different pieces than the ones I did.

My occasional mistake is that I am prone to confusing someone calling the documentary bad based on their expectations or response with an actual indication that it was poorly filmed or created. That’s on me. It looks nice! It’s a nice-looking film! Some folks just don’t like what the nice-looking film showed.


On the history side, DEFCON has 20 years of events, and a second event: Black Hat, the “professional” version of DEFCON that has been around almost as long as DEFCON. I chose not to include much of the previous years, and nothing of Black Hat. That was 100% space and timeline driven, not a verdict on how interesting the subject matter is. That said, I had a lot more footage of DEFCON 20 than of someone’s stories about something that happened 7 years ago that I couldn’t easily verify or find corroborating interview footage for. Black Hat is a fascinating thing, but I thought it was spreading our resources thin to try and get Black Hat in there, especially considering it would only be covered for a couple of minutes, confuse the hell out of the audience not aware of Black Hat, and then the film would be off to unrelated subjects for the rest of the movie.

Some called it a conspiracy. These people need to get out more.

I believe strongly that DEFCON deserves a book, although I’ve no interest in writing that book. A book would allow an overview and discussion of the many themes and points of DEFCON and what its place in the greater hacker social structure is. The movie couldn’t do that, but it’s a worthwhile discussion that a 300-400 page book could tackle nicely.

Speaking of social…

My choice to focus on the partying and hanging out as much as (or more than) the core subjects of the talks and presentations really rubbed people the wrong way. I’d agree with them that it was annoying if it weren’t for the fact that I believe strongly that for a very large contingent of people, the parties and socializing are much more important than the talks, which are all recorded and distributed post-conference. I am sure many people wake up, quietly put on their neatly-pressed T-shirts, walk into the first talk on their list, and sit with a coffee through 8 presentations, then go back to their hotel rooms and curl up with a good John Grisham novel before turning in for the night. I just don’t think those people make up the majority of the attendees. I could be wrong.

Considering the amount of partying and events that go on deep into the mornings during DEFCONs, I thought them important to portray, but it does appear a small vocal set of people wanted in-depth discussion of the subjects of the talks. I have a spoiler for those people – it won’t work cinematically. In fact, a lot won’t work cinematically that runs just fine inside people’s heads, and that inside-the-head theater people have is going to kick the ass of any film being made. It makes no sense to compete with it.

Instead, I chose a sequence with Dan Kaminsky and Renderman giving presentations (one on airplane security, the other on an undefined but technical subject), to speak at a higher level about the nature of giving a talk at DEFCON. I think anything too deep would be boring, and get away from the greater lesson/demonstration being given by the movie. But the people who disagree, man do they disagree, so I’m walking away with a lesson learned there.

(GET LAMP had a similar situation where some people wanted much more in-depth description and process of people designing the games, footage that, I’m sorry to tell you, doesn’t exist. Some wanted more footage of people playing games, and that’s a much different ball of wax, and may change how I do things in the future.)


I could go on, into greater details, but most of what I wanted to say is here. DEFCON: The Documentary was a rather intense project, and not the kind of film I would normally want to make – I doubt I’ll make another one anything like it. That said, it taught me a ton of things about filmmaking, got me my first film crew, and resulted in a work that, for all the handwringing in this essay, I am very proud of and would happily watch again and again. So it’s a win across all columns.

All in all, a fun movie.

Now, let’s make more.


The JSMESS Triumph —

What an amazing few weeks it has been!


We made improvements to improvements. We refined refinements, and refined them even more. We found shortcuts and qualities and features. And eventually, it came down to a day when an automatically running script slammed through a list of every functioning platform MESS supports and created a working or near-working JSMESS version.
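
For the curious, the automatic run itself was nothing exotic, conceptually just a loop over the platform list. Here’s a rough sketch in Node-flavored Javascript of the shape of it; the platform names, the make variable, and the output handling are all placeholders, not the actual script:

    // batch-build.js: hypothetical sketch of looping over every platform and building it
    // (the platform list and the make invocation are placeholders, not the real JSMESS build)
    const { execSync } = require('child_process');

    const platforms = ['coleco', 'a800', 'apple2c'];   // stand-in for the full MESS list

    for (const p of platforms) {
      try {
        execSync('make SUBTARGET=' + p, { stdio: 'inherit' });   // pretend each platform is a make target
        console.log(p + ': built');
      } catch (err) {
        console.log(p + ': failed, moving on');
      }
    }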

More than that, we found a single line in the code, one which was meant to make the emulator work better within browsers, but was now having the effect of slowing the program down. We changed that single line to say “0” instead of “60”, and to our shock, JSMESS now runs many platforms at 100% speed.

It was a fun experience to play with a Colecovision at 70% speed in the browser. Running it at 100% speed is another experience entirely – it is, as I’d hoped, a little window where you see an entire other computer running, doing its thing, accurately showing you images and visions from decades ago but breathing as alive as if they were crafted this morning.

To celebrate, I updated the JSMESS official site, purchasing a basic theme of dynamic images and transparency (since you need to be running javascript anyway), and then jazzing it up to stress how completely fun and fast JSMESS is to work with. I also now have links to all 300 supported platforms that will be in JSMESS 1.0. Three hundred!

We’ve got a bunch of tasks ahead of us, but they’re rapidly becoming the kind of tasks that winners have to do, that consist of the effort of the victory lap after a draining marathon, or that weigh heavy the crown of awesome.

They include:

  • Adding a virtual keyboard so that you can hit controls and keys that aren’t on whatever keyboard you’re using. It won’t be good for arcade games, but it’ll make using the machines a lot easier.
  • Going through and matching collections of software and support materials to the 300 platforms. Luckily many use similar software or are easy to track down. It’s just a lot of them, you know?
  • Finding what slows down the remaining platforms that run under 100%, and getting rid of that. Currently, our slowest platform is the Sega Genesis, which runs at 50% speed – Sonic is taking Valium and that needs to end.
  • Sound is not activated, because the sound API we’re currently using is going to be replaced with a new one, and we’re waiting on that. It should work nicely when it’s ready, though.
  • Moving closer to a distributable package that anybody can set up on their own webservers in minutes. The great thing about Javascript in this case is we can just provide you a .js.gz file and it’ll work wherever you are. There’s no dealing with binaries or libraries or anything – that’s the browser’s job. (There’s a rough sketch of what that could look like right after this list.)
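
To give a sense of how simple I want that embed to feel, here’s a hypothetical sketch of a loader, written the Emscripten way (a Module object defined before the generated script loads). The file name, the arguments, and the element IDs are invented for illustration, not the shipping 1.0 package:

    // hypothetical embed sketch: drop a canvas in the page and pull in the packaged emulator
    (function () {
      var canvas = document.createElement('canvas');
      document.getElementById('jsmess-goes-here').appendChild(canvas);   // invented element ID

      // Emscripten convention: define Module before the generated .js file runs
      window.Module = {
        canvas: canvas,
        arguments: ['coleco', '-cart', 'game.rom']   // placeholder command line
      };

      var script = document.createElement('script');
      script.src = 'jsmess-coleco.js';               // placeholder name for the distributable
      document.body.appendChild(script);
    })();

One script tag, one canvas, and the browser does the rest.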

I’ve been cranking away on a new demonstration page, where a dream comes true: you have screenshots of many famous programs, like Visicalc or K.C. Munchkin, and one click brings you face to face with these programs, running as they always have, full speed, and waiting for you.
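
The mechanics of that page are nothing fancy; assume each screenshot carries its platform and program, and a loader function like the sketch above. Everything named here is hypothetical:

    // hypothetical sketch of the demo page: click a screenshot, swap in a live machine
    // (loadEmulator and the data- attributes are stand-ins, not real JSMESS code)
    var shots = document.querySelectorAll('img.demo-screenshot');
    Array.prototype.forEach.call(shots, function (img) {
      img.addEventListener('click', function () {
        var slot = document.createElement('div');
        img.parentNode.replaceChild(slot, img);      // the still image becomes a running computer
        loadEmulator(slot, img.getAttribute('data-platform'), img.getAttribute('data-program'));
      });
    });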

I feel really bad when people ask me about this in person because I can’t shut up about it. It’s like the first time you realize what version control can do, or taking a new bike out for a spin, or discovering a great new way to walk somewhere – it’s exciting and new and the results feel infinite, far beyond what we’re laying into them. It’s a brilliant new day.



The Secret Inaction of a Cancelled Documentary —

Privately, I’d been working on another documentary.

Yes, I know I’m working on three, and yes, they are very time consuming, but when a subject inhabits your brain and won’t let go, you have to let it run wild for a while.

The subject was Action Park, an amusement park in New Jersey that had one of the most amazing reputations you could imagine. It was truly dangerous, and it was also one of the most beloved places in the minds of people who went there. I went there, friends went there. Some people went there and got very hurt. A half-dozen people went there, and died.

The working title was Traction Park.


Trust me, there’s a movie there.

If you want some of the best sources of reading up on Action Park, you’d do well to read the Weird NJ site overview of it, the Wikipedia entry about it (but read through the historical edits, as much has been shifted and removed), and there are articles about it spread around the net (although the WeirdNJ ones, old and new, have gathered some amazing stories in the comments that others have not).

I had documentation on the park, had taken a quiet research trip to the current incarnation of the location, and had pulled in a bunch of information. Nothing was shot yet.

But now, breaking wide, are a pair of short films about Action Park, featuring the WeirdNJ guys, and with interviews with the founder’s son and employees. And some bonus footage as well.

If you want to see them, they’re here.

So yeah, that ends my plans right then and there.

Yes, ultimately, a documentary by me on this subject would have a different feeling than this short documentary pair that has come out, and I was going to go into some dark and light places they would not. You’d have teared up more during mine.

I’d have been more focused on the ideas of freedom versus safety, about how Action Park was a libertarian dreamland – a park that could be incredibly fun or kill you, where you could have the real experience of near-injury and drive home knowing you did something amazing. I wanted to interview people whose relatives had died there and people who worked at the park, along with people who considered it the best summer event of their lives. I’m a completist, as you know – so we’re talking about 50-70 interviews.

But I don’t have much time in this world, and making a somewhat better documentary on a subject that has already been done is not the best use of me or my time. I do have a lot of projects on my plate – it’s just that this one burned inside, a plan.

There’s no tragedy here – not a frame of footage had been shot, and the interviews were not lined up. It’s an unmade project, one that doesn’t sit half-finished – it was never beyond the research phase.

If the DEFCON documentary had not happened, if I’d not gotten that call, I’d have probably secretly shot this movie. But DEFCON got there, and I made that instead. On the whole, I think that’s a real good trade-off. Nobody would have made the DEFCON film the way I did, and I got to be really close to and enjoy the company of people who’ve had a huge influence on my life. I only went to Action Park once – I’ve spent the equivalent of months of time in Las Vegas around DEFCON. It was a good choice.

So yeah, now you know. Surprise!

P.S. Really, read up on the history of Action Park. It’s insane.



One Last Bit on JSMESS for a While —


I know, three postings in a row is a bit much.

When I get into a topic, I keep going, just never stop. It’s the kind of mindset that can end up with something like the BBS Documentary – just one guy slamming away at getting things done, day in and day out.

In this case, I’m in much better shape, because I’ve got some great people helping me – between Vitorio, Justin, and myself, we’re this unstoppable force of figuring out what to do to get things running.

After I started to understand the nature of the makefiles, I wrote some scripts that would generate better-formatted ones, including commented-out options that might be used. It saves a lot of time. I also wrote a script that, given a string, returns all the potential modules matching that string. So, everything became easier.
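
That second script is basically a recursive text search over the MESS source tree. The real thing is BASH, but a Node sketch of the same idea (with the source path guessed) looks about like this:

    // find-modules.js: hypothetical sketch listing source files whose name or contents
    // mention a given string, as candidates for a platform's makefile
    const fs = require('fs');
    const path = require('path');

    function walk(dir, needle, hits) {
      for (const name of fs.readdirSync(dir)) {
        const full = path.join(dir, name);
        if (fs.statSync(full).isDirectory()) {
          walk(full, needle, hits);
        } else if (name.endsWith('.c') || name.endsWith('.h')) {
          if (name.includes(needle) || fs.readFileSync(full, 'utf8').includes(needle)) {
            hits.push(full);
          }
        }
      }
      return hits;
    }

    // usage: node find-modules.js coleco   (assumes the MESS source lives in ./src)
    console.log(walk('./src', process.argv[2], []).join('\n'));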

So, within a few days, we added JSMESS emulation for:

So we added nearly 50% more platforms within 48 hours. Yeah, we’re starting to get the hang of this.


Now things start getting to the next level.

MESS emulates roughly 1600 machines. Of those 1600 machines, roughly 600 are unique. Of THOSE 600, the MESS team internally rates their state as to how functional they are. Apply that and it’s (again, roughly) 250 platforms that are currently emulated properly in MESS.

(What of the other ones? Some are skeleton drivers, layouts that are awaiting work, or they’re lacking some aspect that hasn’t been reverse engineered, or they’re just suffering from a lack of people excited enough to do the deep-down assessment needed.)
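
If you want to reproduce that rough count yourself, MESS will dump its machine list as XML with the -listxml option, and each driver carries a status rating. A quick-and-dirty filter over that output, assuming the attribute names haven’t shifted, is all it takes:

    // count-good.js: hypothetical sketch counting drivers MESS rates as "good"
    // feed it the output of something like `mess -listxml > mess.xml`
    const fs = require('fs');

    const xml = fs.readFileSync('mess.xml', 'utf8');
    const good = (xml.match(/<driver [^>]*status="good"/g) || []).length;
    const total = (xml.match(/<driver /g) || []).length;

    console.log(good + ' of ' + total + ' drivers are rated "good"');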

Of the 250 platforms, we should make some subset of them work in JSMESS. Some, in my mind, will be relatively simple, like emulating the Atari 400 (we’ve already got the 800 working and they’re similar hardware). Others, like the Macintosh family, are notably more difficult. But some are just not the best to go for in a JSMESS 1.0 situation.

Ah, JSMESS 1.0. How I’ve dreamed of you.

So, from those 250 platforms, we’re probably looking at, oh, 30-40 of them for a 1.0. Might be more, probably won’t be less. A gaggle of computers emulated and in javascript versions for the very first time!

One of the issues is speed, and the excellent Emscripten team is looking at the JSMESS routines and determining where speed can be boosted in the transfer to Javascript. We know the ability to hit 100% speed of the emulated platform is in there, and now with so many platforms to use as examples, they can start finding common boosts and improvements. I have great faith in them and I’m happy to have JSMESS be a stress test for the next generation of Javascript programming.

Assuming the speed gets shored up, the next problems revolve around the keyboard interaction being subpar, because it turns out keyboard handling in browsers is a significant issue. It’s an entirely fixable problem – someone just has to go in and reset the keys and make a configuration file better suited to being in a browser. I’m sure that’ll happen soon. Obviously consoles are easier than computers – fewer controls to map.
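
The browser half of that fix is mostly about claiming keys before the browser acts on them. A minimal sketch of the idea, where the function actually handing the key to the emulator is a placeholder:

    // hypothetical sketch: keep the browser from eating keys while the emulator has focus
    // (sendKeyToEmulator is a stand-in for however JSMESS ends up receiving input)
    var canvas = document.getElementById('emulator-canvas');   // invented element ID
    canvas.setAttribute('tabindex', '0');                      // canvases aren't focusable by default

    canvas.addEventListener('keydown', function (e) {
      e.preventDefault();                    // stop space/backspace/F-keys from scrolling or navigating
      sendKeyToEmulator(e.keyCode, true);    // placeholder: key pressed
    });
    canvas.addEventListener('keyup', function (e) {
      e.preventDefault();
      sendKeyToEmulator(e.keyCode, false);   // placeholder: key released
    });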

Next comes the packaging. I’ve given thought to this. Here’s my rough sketch.

I’d like it to be similar to JW Player. Not JW Player as it is now, as a business with the demand for your e-mail before you can get a free copy of it, and with commercial licenses. It’s obviously a little Adobe wannabe now, and I’m not talking about that.

When JW Player first came out, it was a pretty perfect little audio/video player for a website. You got the package, and you could put JW Player right into your website with little effort, with all sorts of little settings for your needs. Need it to just have a single song in there, ready for someone to click? Template for that. Need it to play an album? Got a template for that. Video player? Got it.

Like I said, JW Player got all business-y and it’s got licenses and a whole bunch of things now, and that’s fine – but the core original approach of giving people tools to add audio and video to their sites, that’s what JSMESS is for computers.

I see a set of templates where you can say “Put up this computer running this software”, or “Put up this computer with a drop-down menu for these software packages”, or “Put up this computer, twice the size, with a set of options selectable along the right side.” People are good at templates. I have faith we can make a set that will work nicely on a page.
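
What I picture a template boiling down to is a tiny declarative blob the page author fills in, something shaped roughly like this (every option name here is invented for illustration, not a real 1.0 API):

    // hypothetical template configuration, to be read by a loader script
    var jsmessTemplate = {
      container: '#my-retro-corner',     // where on the page the emulator goes
      platform: 'zx_spectrum',
      scale: 2,                          // "twice the size"
      software: [                        // becomes the drop-down or side menu of programs
        { label: 'Manic Miner', file: 'manic.z80' },
        { label: 'Jet Set Willy', file: 'jsw.z80' }
      ]
    };
    // the loader would read this object and wire up the canvas, menu, and controls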


The nice thing about emulators is they sell themselves, once you see them. When I put up a link to the JSMESS ZX Spectrum emulator, even though I’d been putting up links to the other platforms, a range of people got really excited about it. For them, that was the selling point – a ZX Spectrum in a window. Same with the Sam Coupe crowd when that appeared. For these groups, once they see “Our favorite home computer in an embedded window”, the debate is over and shifts to “when can I put this everywhere?”

As of this writing, the loader for the JSMESS homepage is very simplistic, and meant to be flexible across all the platforms. I think with some effort, elegant and more helpful code sets can be added. I think we’ll revise things and make them easier to use, and we’ll start seeing these things popping up in weblogs and websites in no time once the JSMESS 1.0 distribution comes out. It’ll be great.

And the aftermarket! People taking it and adding features, customizing the templates, and finding bugs – they’ll be a welcome addition to making it better.


So, one last thought about this, before I turn to the rest of the things I’m working on.

JSMESS, the project, is not the emulator.

MESS is the emulator.

The fine folks behind MESS are working very hard on improving accuracy, making MESS interface with all sorts of situations, and generally trying to make the whole thing as portable as possible. One look at, say, the list of code additions for the most recent update will tell you that.

All JSMESS is, is an attempt to take MESS and port it to another language, in a way that translates as much of the MESS quality and experience as possible to the new language. As it turns out, this language (Javascript) is an open standard with implementations across most known browsers and therefore most hardware platforms, including mobile, desktop, and dedicated systems.

In doing this, it brings emulation of this sort to a much, much wider audience, makes it accessible to education and communities, and turns computer history into an embeddable object.

It’s powerful, but also limited and simple. It is difficult work to port MESS to Javascript, but it’s a minuscule sliver of the effort put into maintaining MESS itself, to seek out hardware, do the legwork of finding how to emulate a system, and determine the thousand, maybe million little weird aspects of these systems for the use of generations to come. That’s the effort to respect, those are the people to applaud, before JSMESS gets any of that.

I hope this effort will bring attention to the MAME/MESS projects, the kind of attention where someone, seeing an end-to-end advantage (if they help MESS, then the embedded computer object on their website gets more accurate), gets right in there and gives MESS the deep research and information needed to capture even more accuracy.

More than once in this, someone has snickered and gone “yeah, but do they support the [whack-ass peripheral] on [platform]?” And more often than not, of course they have. These people care about this. It has burned some of them out, it has made some of them rage, but they believe in it.

It’s something worth believing in too.

To 1.0!

 


More JSMESS: Little Help Here —

Warning: A little about project management, then a whole lot about compiling.

Three entries in a row about JSMESS! Well, I happen to think it’s one of the most important things I’ve been involved in. I mean, just think of it – vintage computers in a window, like they were a movie or a track of audio. The same way we show people something we’re working on or a piece of music we like, we can already do that with a computer or a computer experience. That’s magical to me. If it’s not magical to you, I don’t have anything more to discuss with you this round.


The fundamental magic of an emulator is that it creates a computer inside your computer, one that functions in many ways like the original. It’s never perfect, and the people who seek only the perfection and forget that a wrinkled photograph is better than none at all… well, they’ll never be happy. But an emulator can bring back your childhood, prove a point, answer a question, frustrate and delight. It may not inspire your own children to understand what compelled their parent so much into that world, but it might make them realize how far we’ve come, or the reason scores are at the top of the screen, or why people like 8-bit references so much. And as a research tool, well…. being able to summon any artifact with the cross-platform and wide reach of browsers is a bit amazing all in itself.

Last night, while playing around, I realized I had three different computers running in windows on my laptop during testing, all of them pieces of my own past, all running along just fine.


For the observant, those were an Atari 800 running Boulderdash, next to an Apple IIc running Bank Street Writer, and finally the 1980 Atari Dealer Demo. Each has its own story I won’t go into here.

In its current state, the JSMESS emulator can bring up, on your machine, in multiple browsers, hundreds of thousands of software programs that have been painstakingly transferred over the past few decades. It would already be a victory were work to stop right here and now.

Obviously, we have a ways to go and the energy is still there to do it.


Now, let’s be very clear here – the lion’s share of work on JSMESS is MESS itself, which is a wonder of teamwork and dedication to accuracy and quality. Those folks, of whom there are dozens, are putting in a bushel of effort to bring obscure systems, unusual setups, and the strange mystical workings of long-forgotten platforms to the world. I talk a lot, but without MESS itself, there would absolutely be no JSMESS.

When I started working on this project, some of the MESS developers were not pleased at the idea of a not-as-fast, somewhat strange port of MESS, especially when some aspects were being lost in the transfer to browsers. Their concerns are valid, so I rush to say that if you see something wrong with how JSMESS does something, chances are very good it’s JSMESS, not MESS. (You should download MESS at their site and check it out – it’s pretty amazing tech.) My hope is that JSMESS will honor MESS by bringing it to many more thousands of people.


A little bit about project management.

I’m not a talented programmer – I use BASH and sit down and get a pencil out when I do a tiny bit of PERL programming to make something “faster”. My attitude is other people do better work, with more thought, and it’s not for me to add One More Shitty Thing into the world that’s been done better elsewhere. So I write stuff that takes the excellent tools people have made and push them into a specific direction that I need for getting the job done.

This turns out to be how I manage projects as well.

There are a number of javascript emulators of machines out there as we speak – JSMESS is not the only one. Some are very mature, and quite amazing, inspirations for the work JSMESS is doing. For example, this is one amazing Apple II emulator, with a spectacular presentation. For the experience of being able to decide exactly what sort of machine you can emulate, check out this Amiga emulator that lets you configure the hardware and features (click on “config”) before you run anything. I mean, isn’t that amazing work?

What JSMESS has over these others is what MESS has over most single-machine emulators – a sacrifice of exacting recreation and bespoke experience in exchange for the ability to emulate many, many machines – hundreds at this point. For example, MESS has started the process of emulating the Ensoniq Mirage, a synthesizer! Why not? It’s a computer, after all. And as JSMESS grows, it will encapsulate more and more technology, maybe even everything that has a set of chips in it. So work done to port MESS to Javascript (and whatever open cross-platform technology comes after Javascript) is a hell of a lot of bang for your buck.

(By the way, running Disk Drive Utilities in an emulator makes me feel like the program is Neo and I’m the Architect. “Your first question, while the most pertinent, is the least relevant.” “ARE THERE ANY BAD SECTORS ON THIS DISK.”)

So what I’ve done over the past couple of years is work with volunteers to come up with a framework for porting MESS to Javascript. This has resulted in the work with Emscripten, a compiler toolchain that turns C/C++ code into running Javascript programs. That took a while, but it does work.
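
The end result of an Emscripten build is just an ordinary .js file that behaves like the native program would. Here’s a hedged illustration of how that output can be driven, using Emscripten’s Module conventions and a made-up file name; whether this exact incantation works depends on the Emscripten version, but the point is that the output is plain Javascript you talk to through a Module object:

    // hypothetical sketch: exercising an Emscripten-built emulator under Node
    // (messtiny.js is an invented name for the generated output)
    global.Module = {
      arguments: ['-help'],                  // the same command line the native binary would take
      print: function (text) {               // Emscripten routes the program's stdout through Module.print
        console.log('[emulator] ' + text);
      }
    };
    require('./messtiny.js');                // loading the generated file runs the program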

Now, let’s get technical. And where I need your help.


We figured out long ago we simply couldn’t compile all of MESS into a single .js file – it makes every browser explode. Chrome currently has a hardcoded limit of 32,767 variables, and Firefox has a point at which it hits the CTRL-FUCKIT button and drops the mic.

So, instead, we went a different route. Within the MESS system there’s something called a “Tiny Make”, which allows you to build a minimal version of MESS for the purposes of testing. In their case, it generates a version of MESS that only plays the Colecovision. It turns out this “tiny make” is small enough to be converted by Emscripten into a .js file that does everything we want, and which still contains basically all the features of MESS within it. (It even has the internal menu system and configuration!)

Once it was proven that this worked with Colecovision (you can try it out here), it’s “just” been a case of making the rest of the platforms that are properly emulated work. (Here’s a list of all the platforms currently in MESS: 1766 variations of 683 different computers!) Of the 683 platforms, “only” about a hundred are in a ‘good’ emulated status, and of those ‘good’ ones, some are minor variations of a platform, like emulating the Dragon 64 and the Dragon 64+.

So, the high level plan is to make Makefiles for all the “good” emulated platforms (about 100 as of this writing) and put up JSMESS pages for them.

After we have that going, we’ll go back and make keyboard mappings better, presentation better, and so on. We also, in parallel, have the Emscripten gang working on better and better speed improvements for the generated Javascript (right now speed ranges from 11%-100% of the original platform’s performance). But it all depends on having Makefiles and compiling the systems. That’s our bottleneck right now.

So, what exactly am I talking about here with the Makefiles?

Well, the way that we put together a platform is to compile just the modules needed to make that system run. So, for example, the Colecovision needs the following files compiled to run:

emudummy.c, drivers/coleco.c, machine/coleco.c, plus Z80 Support, plus MCS48 Support.

Once these are all compiled, we have a working Colecovision.

For the Atari 800, here’s what we threw in to make it work.

emudummy.c, machine/6821pia.c, video/atari.c, video/antic.c, video/gtia.c, machine/atari.c, devices/appldriv.c, formats/flopimg.c, formats/fdi_dsk.c, formats/td0_dsk.c, formats/imd_dsk.c, formats/dsk_dsk.c, formats/d88_dsk.c, formats/atari_dsk.c, machine/atarifdc.c, machine/ataricrt.c, machine/atari400.c, plus POKEY support, plus DAC support.

Now, some of these might not be needed, but many are, and when we compile it, we make a .js file that can emulate an Atari 800.
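
One way to keep this from turning into makefile soup, and this is only a sketch of the idea rather than the literal build, is to treat each platform as a little manifest and have a script spit out the makefile from it:

    // hypothetical per-platform manifest; a generator script would emit one makefile
    // (and thus one .js build) per entry - the structure here is invented
    var platforms = {
      coleco: {
        sources: ['emudummy.c', 'drivers/coleco.c', 'machine/coleco.c'],
        cpus: ['Z80', 'MCS48']
      },
      a800: {
        sources: ['emudummy.c', 'machine/6821pia.c', 'video/atari.c', 'video/antic.c',
                  'video/gtia.c', 'machine/atari.c', 'machine/atarifdc.c'
                  /* ...plus the devices/ and formats/ files listed above */],
        cpus: [],
        sound: ['POKEY', 'DAC']
      }
    };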

Obviously cleanup is in order down the line, but we’re just in the “get it going” phase, and it’s OK.

Now, our problem is forming the makefiles for the rest of the platforms.

It’s a weird art. I describe it like walking into a rock concert and determining who in the crowd has fewer than five fingers on a hand. It’s not what MESS was designed for – they do not have a facility to say “just compile this single machine.” (There have been debates about it for a number of years, but they currently do not do it.)

It’d be great if this was automated somehow, if it was possible to say “show me everything that this emulation of this system needs”. Maybe there is! But we haven’t found it yet.

So instead people are working very hard in the #jsmess channel on EFNet to swim through the code and figure out, as best we can, what’s needed for a given machine. It’s voodoo and strange but the result is emulated computers, so it’s rewarding when it works.
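
One semi-automated trick that helps, and this is a rough approximation rather than the actual process, is to chase the #include lines outward from a platform’s driver file and treat that closure as a first guess at the module list:

    // trace-includes.js: hypothetical sketch following #include "..." lines outward
    // from a driver file to approximate what a platform's makefile will need
    const fs = require('fs');
    const path = require('path');

    function trace(file, srcRoot, seen) {
      if (seen.has(file) || !fs.existsSync(file)) return seen;
      seen.add(file);
      const text = fs.readFileSync(file, 'utf8');
      const re = /#include\s+"([^"]+)"/g;
      let m;
      while ((m = re.exec(text)) !== null) {
        const header = path.join(srcRoot, m[1]);              // crude: assumes includes resolve against srcRoot
        trace(header, srcRoot, seen);
        trace(header.replace(/\.h$/, '.c'), srcRoot, seen);   // the matching .c is usually what make needs
      }
      return seen;
    }

    // usage: node trace-includes.js src/mess/drivers/coleco.c src
    const hits = trace(process.argv[2], process.argv[3] || 'src', new Set());
    console.log(Array.from(hits).join('\n'));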

So? Want in?

You’d be doing a lot of good, and if this is the kind of challenge you like, the rewards are immediate and great. Maybe you’ll even figure out a way for us to do it with a lot of scripting help, to make it that much easier. We can dream.

Or maybe MESS, down the line, will add this feature. Who knows.

But I’ve laid it out to you.

Hurrah for emulation!



The Javascript MESS Plays Commodore 64, Commodore PET, and Apple IIc —

Geez, it seems only yesterday we were talking about emulating the Atari 800… oh wait, it was yesterday. Well, thanks to Vitorio working hard with the Makefiles, we now have emulation using JSMESS of the Apple IIc, the Commodore C64, and the Commodore PET.

So, with the caveat that we still have a long way to go with regards to keyboard mappings, profiling to find improvements in speed, and a bunch of other issues, I can now say that it is possible to interact with and appreciate well over 200,000 programs in your browser, with no plug-ins.

The best part is that there’s no sign of stopping, other than the fact that our volunteer force needs time to do their actual lives, and can’t be spending too much time working out makefiles and other stuff. If playing with makefiles to help us bring another few dozen platforms online interests you, stop on by: #jsmess on EFNet.


Emulating specific systems in JavaScript is nothing new, and some of them are very mature and very well-developed. The big difference here is that by relying on the MESS developers, who are active, talented, and committed to emulating as much as possible, we can focus on a small set of issues regarding browser compatibility, and in doing so, increase access to this vintage software and programming hardware.

Much like MESS and MAME itself, the approach of JSMESS is to use the power of shared code and a unified approach towards all platforms to ensure the widest amount of support. Platforms that otherwise would have no advocates are showing up in this collection of emulation. Over time, support for various plug-ins, cards, and peripherals ensures that we can take a reasonable shot at capturing a lot of vintage computer history.

As you can imagine, there is a bit of debate within archiving and preservation communities as to the viability of utilizing emulation over original hardware. But everyone agrees that in 50 years we will have a smaller percentage of the equipment than we do now, and emulation is very likely the way to go. The fact that it is possible to bring this emulator up in a window on a phone or iPad or home computer or laptop ensures that a much wider audience can learn what you’re talking about when you mention some sort of program. It also concentrates a lot of interested effort in emulating items into branches of code that the widest audience will see. If you contribute code to MESS, it will now potentially end up in a browser, capturing an otherwise lost aspect of computing. That’s priceless.


As we get more platforms up and running through the pipeline, attention turns to speed and usability. Right now, JSMESS is a test program, providing a cute trick for you to enjoy checking out old programs. But it’s far from the level of usability that the original MESS has. The hope is that over the coming months we will see improvements in all aspects of this program, turning it into a cross-platform browser experience that will, as I hoped, change computer history for everyone, for all time.

Now go screw around.