ASCII by Jason Scott

Jason Scott's Weblog

The Mirrorball —

I’m probably going to regret posting this, but I deal mostly in regret and nostalgia daily, so it’ll all be just fine.

The Thing is now everywhere online. It’s absorbing everyone, every place people begin to talk about any aspect of people. I can’t think of a social interaction space of any sort where it’s not currently in full rage. But I am not interested in The Thing. I’m interested in a specific vector of The Thing, which is (as far as I know) not quite explored, and appears to be a syndrome of the chicken-coop world the “online-ing of everything” is causing.

I am not using any specific words because that is how The Thing gets in and it is less about The Thing than what The Thing happens to be finding in the weakness in human beings.

Back in my heady days of Wikipedia Critic At Large, I was also fascinated with how Wikipedia was a dark mirror that held up to us the very fundamental issues of our souls. How an attempt to create a politics- and drama-free zone for information retrieval simply became a vacuum that was absorbed by the warring factions of Inclusionism and Deletionism, a place where agendas over stories that were at utter odds with each other were expected to live in the same space, and where the wonkified and self-absorbed petty tyrants could run free as long as they spoke just the right language and used the right phrases to describe their extension of power. This was all activity that had been in effect for all of civilization, with some industrialization of that effort in the 20th Century – it was just out in the open and being provided an intense laboratory environment in Wikipedia.

To describe everything Wikipedia has shown us in terms of information manipulation, techniques for controlling the message, and the power of persistence over truth or knowledge… that’d be a book. A book I’m just not going to write, thanks very much.

But outside of a few angry forums and a couple excellent articles, Wikipedia’s situation was confined largely to Wikipedia itself – if you step in there and walk the hallways of the talk pages or read the mailing lists run by the Wikipedia organization, it’s all there and affecting people in that space but not, say, every weblog, every Facebook, every comments section, every moment of online life.

Not so with The Thing.

If I appear to be talking about this all in a very strange manner, it’s because The Thing is so virulent, so intensely catastrophic in terms of interaction with it, that the chances of it blowing up in your face are beyond extreme. But let’s try anyway.

When you point at the moon, a person will look at the moon, and a dog will look at the pointing finger. Please be a person in this.

Here is my theory.

  • Generally, as a person, you do not like to see the oppressed oppressed.
  • Generally, as a person, you have a limit at which you feel you must speak.
  • Online life lowers this limit below the floor, meaning all things need you.
  • Online life also conflates issues and people, equating them in language.
  • Issues tend to actually be groups of issues appearing, from the distance of outside circumstance, to be the same big issue.

And then this happens.

  • If you witness a group of issues, some relevant and some irrelevant, you will feel a need to respond to the relevant issues.
  • If you witness someone responding to the relevant issues of the group, you will think they are responding to the group of issues.

The exploit, taken from this, is then clear: Group up a bunch of issues (Apples, Oranges, Watermelons, Rat Poison) and then people who show concern about the Rat Poison will seem to be showing concern on Apples and Oranges. And people who are supporting Apples and Oranges could easily be mistaken for supporters of Rat Poison.

Once that spark hits the kindling, it’s over. Human foible, language, and attention to detail that can only come from the online world, combined with the reaction time of milliseconds, a single click, ensure the fire can burn forever, consuming all in its path.

Like I said, I thought of Wikipedia as Humanity’s Dark Mirror. I think this new Thing is a mirrorball, a globe of facets that are bound to reflect what you are looking for in your direction, and aim a distorted, skewed image of a hundred other things that will catch your eye as well, and beg you to do something about them.

Once in, you’re in. Maybe I’ll be in now. I hope not – I’m really busy as it is.

Here’s where, having framed the situation as best I could, I propose a solution.

I have no solution.

I am stumped.

I loved my little world on the telephone as a teenager – I saw possibilities and dreams far beyond my youthful sadness and isolation. I wished for a world where, every moment, I could reach out to other people, talk into the day and the night. I wished that I would never again know the silence of my own heartbeat and fears of a life ending without having known all that could be known. I wanted connection and interconnection, worldwide.

I appear to have gotten that.

I am afraid.


The 8-Bit Generation Reboot —

Back in 2012, I wrote an entry called 8-Bit Generation: Missing in Action, in which I described and lamented a seemingly lost and impeccably filmed documentary called The 8-Bit Generation.


And on September 13th, 2014, on my 44th birthday, I was able to show an audience footage from the series, along with interviewing the producer of the film, Bruno Grampa, who had been flown in from Milan for the occasion (the XOXO Festival in Portland, OR). I also had extensive conversations with the creators, both in e-mail and in person at the event.

Here is VERY ROUGH AND GENERAL INFORMATION that I will pass along to you, and which will be fleshed out with more detail in the next few months.

  • The work is real – there are 55+ interviews with a whole range of amazing characters from the story of 8-bit computers and videogames.
  • Some of these people, like Stan Veit and Jack Tramiel, are no longer with us.
  • They have major amounts of the work edited, and some of these sequences were shown at XOXO, demonstrating how they tell the story of early computer technology and the industry.
  • The sound and video are impeccable, just like the trailers. Zoe Blade did a lot of the music.
  • The company/production house that did the filming and editing went out of business; more than a dozen people lost their jobs.
  • The rough plan is to raise funds through crowdfunding, finish a full-length documentary, and then hopefully go from there to make a series.

Now, I’m not involved in the production or creative direction – I’m just someone from the sidelines who saw a chance to help a production in need. Still, I’m going to help where I can.

People paid in money and got nothing – including me. Rest assured, nobody is living high off that money. Folks went basically broke. Now, with support and a recognition of the unique material, they are going to try and bring it back for release.

I’m sure there are questions, demands, anger, delight. I expect the production team to provide more information in coming weeks and months. I’ve connected them with some potential good folks to help with some of the things they need. I’ll be there as best as one can be.

However it goes, the effort is being made. Nobody wins if this thing disappears. Here’s hoping.



Statement by Jason Scott on Archive Team —


In 2009, I proposed the idea that rapidly-shutting-down websites should be quickly downloaded and saved by a fast acting and future-minded group of web archivists. I called them the A-Team, or Archive Team, and said they’d be the last resort for online history that was otherwise doomed. An unbelievably huge amount of web history had been deleted by that point, subject to capricious whims of startups and dot-com tomfoolery that resulted in quick shut-offs.

This proposal resonated with a good number of people, and they volunteered time, money and effort to make this team a reality.

In the interim years, I have done dozens of appearances, interviews and statements regarding the goals and ideas of Archive Team. With my skills in public speaking and a willingness to stand in front of crowds, I’ve been the face of Archive Team to many people. I’m still involved with the project daily, and provide advice, make connections, and do whatever else I can to help with the goals of the Team.

In 2011, on the strength of a talk I presented at the Personal Digital Archiving Conference, I was hired by the Internet Archive as a free-range archivist, working with the non-profit on a huge realm of projects. Some of these have been internal efforts and some have been widely public, like the collection of software or preservation of various musical archives.

All of this brings to bear a very important distinction:

Archive Team is not an Internet Archive Project, and I am not the owner of Archive Team.

Archive Team has no offices, no phone number, and a website that is a wiki that any reasonable person or team member can edit. While I am a (generally) beloved figure who is appreciated for his public speaking skills and snappy dressing, Archive Team has collectively disagreed with me, and some projects have been approached in completely different ways than I would have approached them.

The Internet Archive does NOT take every single bit of data Archive Team produces (although they take the vast, vast majority) and Archive Team is not “paid for” or “owned by” the Internet Archive in any way.

As the saying goes, if I was hit by a bus later this afternoon, Archive Team would be very sad, but would get back to work by the evening. The early evening.

While I am pleased that the Team listens to my opinion as much as it does, there have been projects and efforts that were started, refined, and producing output long before I wandered into the communication channel. While my presence is then acknowledged, the Team has continued the given effort as it would have anyway, preserving and rescuing online history from total oblivion for the good of society and future generations.

Archive Team is an idea, and the idea is far beyond me or even the Internet Archive at this point. I expect this to continue.

Meanwhile, I have been actively avoiding the front of buses.

Thanks.


OASIS at SXSW: An Asking Thing —

Every time I’ve spoken at the South by Southwest Festival in Austin, Texas, I’ve always had someone else arranging all the details, including the panel and the attendant responsibilities of getting it to pass muster. This time, however, I’ve put together something neat, and the process needs, nay, requires you to really get out there and pound the pavement.

I’m a busy person and endless pavement pounding is not my usual deal, but if you are so inclined, and since voting ends within a week, I’d like to call your attention to the SXSW “Panel Picker” and a proposed on-stage conversation to be held between two very odd retro-oriented folks.

Here’s the full information:

http://panelpicker.sxsw.com/vote/38216

In Ernest Cline’s novel READY PLAYER ONE, the world’s population spends most of its time inside the OASIS, a simulator and ultimate operating system that provides access to an endless amount of games, videos and media to the world. At the Internet Archive, a non-profit dedicated to open access to as much content as possible, a new in-browser interface allows instant access to tens of thousands of microcomputer programs, console games, and emulated vintage hardware. Author Ernest Cline and Internet Archive’s software curator Jason Scott discuss the similarities between OASIS and the Archive, the consequences and results of a world with endless vintage computing access, and what parts of the 1980s they’re working the hardest to save.

It’ll help if you have read Ready Player One, so there’s Ernest Cline’s page about it. It is a fun novel about a world where, among other things, endless amounts of old videogames are available.

It’ll also help if you are aware of the Console Living Room, a project I’ve spearheaded where, among other things, endless amounts of old videogames are available.

The SXSW system made me jump through an enormous number of hoops to get into that panel picker. They have a very large amount of things you have to read and describe, and it appears that having a panel of a guy from Austin (Ernest) and a guy from NY (me) throws us a few rungs back, but my hope is that the surreal experience of a book coming true, and what that means for the world at large, will overcome that.

The deadline to vote is at midnight tonight. I couldn’t bring myself to do the massive amount of canvassing to beg for votes that I think is being encouraged. So my hope is you will vote if you want to, and if not, we’ll figure something out. Ernest and I have a mutual admiration and I look forward to doing stuff with him in the future.

Anyway, vote before Midnight, or leave a comment, or whatever floats your 1980s-loving boat. Thanks.


The Need for JSMESS Speed —

This is another call for help with JSMESS. I promise you that I will get back to more general computer history soon, but this project is really important and changing the entire paradigm of how software is presented is pretty high up the list right now.

I also know this routine of describing the issue and then calling out for help can get pretty repetitive, but the combination of JavaScript conversion, browser interaction, and the entire MESS/MAME project itself means there’s all sorts of strangeness happening in the gaps between.

Let’s focus on the good part first… This program works very well. Almost miraculously, it will run a whole variety of software, game cartridges, and images and present them inside of your browser. Sometimes it’s a bit rough, sometimes it’s a bit slow, and sometimes the sheer mismatch of user interface between the original item and a browser window makes things strange. But even a few years in, I will set up a full-screen image of a game console playing a scrolling classic and I will completely forget how this is happening. It is seriously the bomb.

logo

A while ago I put out a call to have the sound issue looked at. A top-notch developer named Katelyn Gadd stepped forward, helped us create an entirely new sound device, and in doing so fixed about 20 major problems with sound. She also gave all of us a master class in understanding what the boundaries and hurdles are in browser sound in general. Summary: lots, although they are working to change standards to make it better.

The sound situation and resolution was amazing enough to inspire me to try it again. This time, it is speed.

There are a variety of avenues of attack for making the JSMESS system run faster in the browser.

Obviously, it helps if the Emscripten compiler gets more efficient, and work is being done in that direction. Just a year ago (can it have really been just a year?), the ColecoVision emulation was working at 14% speed. Now it consistently runs at 100% on even slower systems. Work on this is ongoing, and the Emscripten development team stays in almost constant communication with us, so that’s being handled.

Obviously hardware will get better over time, but we’re not exactly going to sit back and wait. But stay on point, computer industry!

The browsers themselves are rapidly increasing the speed of their JavaScript engines. The website arewefastyet.com lets you watch nightly tests being run against JavaScript engines so that we may notice that these things are getting damn fast. Again, not my department, not willing to wait.

Certainly, the emulator itself has been working to speed things up, but it might not surprise you to learn that speed is willingly sacrificed in the name of accuracy, to make sure that all the aspects of incoming images are handled and that everything can be, if not future-proof, at least future-existent. If it slows things down for a while, the MAME/MESS team is not bothered by it. It would be nice if somebody went to work on the emulation team itself to optimize things and generally help track stuff down, but that’s a rant for a future entry. Until then, speedups and slowdowns on the emulation can have a pretty drastic effect on the JavaScript version as well.

So that leaves a number of efforts to make the resulting JavaScript output as machine-friendly and fast as possible. It also means a situation where simple code changes applied to the emulator source code result in the JavaScript version being that much more efficient.
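To make that concrete, here is a minimal, hypothetical sketch of the kind of pattern that matters – it is not JSMESS code, just an illustration of why flat typed arrays and allocation-free hot loops are the shape that compiled-to-JavaScript output wants:

```typescript
// Illustrative only -- not JSMESS source. Converting an emulated
// machine's 8-bit palette indices into 32-bit RGBA pixels, once per frame.

// Slow shape: object allocation inside the hot loop gives the engine
// garbage to collect mid-frame and polymorphic types to chase.
function renderSlow(indices: number[], palette: { r: number; g: number; b: number }[]): number[] {
  const out: number[] = [];
  for (const i of indices) {
    const c = palette[i];
    out.push((0xff << 24) | (c.b << 16) | (c.g << 8) | c.r);
  }
  return out;
}

// Fast shape: preallocated typed arrays and integer-only arithmetic --
// the kind of code Emscripten-style output compiles down to well.
function renderFast(indices: Uint8Array, palette: Uint32Array, out: Uint32Array): void {
  for (let i = 0; i < indices.length; i++) {
    out[i] = palette[indices[i]];
  }
}
```

Both do the same work; the second gives the JavaScript engine nothing to guess about and nothing to garbage-collect.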

To help jumpstart things, we have created a page about the need for speed. We’re trying to put down, in terms that will be of use to a developer or coder, exactly what we’re looking for.

If you’ve got the skills to get involved with this, or know someone who does, it would be great to hear from you. It would have an amazing effect on a pretty important project, and we’ve seen cases where one or two simple insights from a new team member makes the entire thing run that much better.

We’ve really come along. There’s a ways to go, and I’m hoping that by writing this we can reach out to someone who can make a difference.

Let’s speed this thing up.



Screenshots Forever and Ever Until You Can’t Stand it —

The Screen Shotgun, as mentioned before, is continuing its never-ending quest to play tens of thousands of games and programs, take screenshots, and upload them into the Internet Archive.

Like any of these tinker-y projects, I’ve bolted a bunch of additional support, error checking, and what have you onto the process, so that it can handle weird situations while being left alone for days on end. There’s still the occasional mess-up, but the whole thing can be reviewed later and I can pinpoint mistakes and re-do them with little effort. It’s a win all around.

There’s now a routine called BOUNDARYISSUES that looks at any emulated program and figures out where the edges are – it’s no big deal and the routine is probably hugely inefficient but it’s nice to keep my hands on the programming side of things, even a little. Thanks to BOUNDARYISSUES some machines that have less than two dozen known software packages are getting screenshots, since the program will do the cropping work and it’s not reliant on my procrastination or free time.
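I won’t pretend the following is the actual routine – it’s a minimal sketch of the idea in TypeScript, with every name illustrative: find the bounding box of everything that isn’t the background color, and crop to that.

```typescript
// Minimal sketch of a boundary finder: given a raw RGBA framebuffer,
// return the bounding box of every pixel that differs from the
// top-left "background" pixel.
function findBounds(rgba: Uint8Array, width: number, height: number) {
  const bg = (rgba[0] << 24) | (rgba[1] << 16) | (rgba[2] << 8) | rgba[3];
  let left = width, top = height, right = -1, bottom = -1;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const p = (y * width + x) * 4;
      const px = (rgba[p] << 24) | (rgba[p + 1] << 16) | (rgba[p + 2] << 8) | rgba[p + 3];
      if (px !== bg) {
        if (x < left) left = x;
        if (x > right) right = x;
        if (y < top) top = y;
        if (y > bottom) bottom = y;
      }
    }
  }
  // Nothing but background: hand back the full frame untouched.
  if (right < 0) return { x: 0, y: 0, w: width, h: height };
  return { x: left, y: top, w: right - left + 1, h: bottom - top + 1 };
}
```

A single pass over the pixels, then a crop to the returned rectangle – hugely inefficient or not, it beats waiting on my free time.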

And how many winners there are!


There won’t be an endgame to this anytime soon – I’m now ingesting hundreds of floppies, thousands of already-imaged floppies, and whatever else I can find online. The Screen Shotgun has its work cut out for it for some time to come.

So thanks to the industrialization of the screenshot, it’s giveaway time!

I’ve decided to throw some galleries of these screenshots on Flickr, because what the hell, I have an unlimited account and I love finding out what the definition of “unlimited” is. So, enjoy:

Feel free to use these any way you want, for whatever you want. Watermark them and I’ll track you down and humiliate you like a hole in your pants. Make art! Do criticism! Set up retro slideshows in your raver club! This art represents hundreds of thousands of hours of work by thousands of people – it’s worth browsing through. (ZX and Atari 800 are my favorites at the moment.) I’ll be adding more sets soon.

So yeah, I’m writing this one in the “success” column. This is years of work, done in a month. But so much more to do!

Get going, shotgun.



WHAT the Cloud? —

In some circles, I’m known as the guy who wrote Fuck the Cloud.

Yet as of this past weekend, I have three Amazon EC2 instances doing massive amounts of screenshots of ZX Spectrum programs (thousands so far) using the Screen Shotgun.

Nobody has specifically come after me about this, but I figured I’d get out ahead of it and reiterate what I meant about Fuck The Cloud, since the lesson is still quite relevant.


So, the task of Screen Shotgunning still takes some amount of Real Time – that is, an emulator is run in a headless Firefox program, the resulting output is captured and analyzed a bit, and then the resulting unique images are shoved into the entry on archive.org so that you get a really nice preview of whatever this floppy or cartridge has on it. That process, which really works best run one at a time per machine, takes some number of minutes; multiply that by the tens of thousands of floppies I intend to do this against, and letting it run on a spare machine (or even two) is not going to fly. I need a screenshot army, a pile of machines to do this task at the same time, and then get those things up into the collections ASAP.

A perfectly reproducible, time-consuming task that can be broken into discrete chunks. In other words, just the sort of task perfect for….

Well, let’s hold up there.


So, one thread or realm of developer/programmer/bystander would say “Put it in the Cloud!” and this was the original thing I was railing about. Saying “Put it in the Cloud” should be about as meaningful a statement as “computerize it” or “push it digital”. The concept of “The Cloud” was, when I wrote my original essay, so very destroyed by anyone who wanted to make some bucks jumping on coat-tails, that to say “The Cloud” was ultimately meaningless. You needed the step after that move to really start discussing anything relevant.

The fundamental issue for me, you see, is pledging obfuscation and smoke as valid aspects of a computing process. To get people away from understanding exactly what’s going on, down there, and to pledge this as a virtue. That’s not how all this should work. Even if you don’t want to necessarily be the one switching out spark plugs or filling the tank, you’re a better person if you know why those things happen and what they do. A teacher in my past, in science, spent a significant amount of time in our class describing every single aspect of a V-8 engine, because he said science was at work there, and while only a small percentage of us may go into laboratories and rockets, we’d all likely end up with a car. He was damn right.

Hiding things leads to corruption. It leads to shortcuts. It starts to be that someone is telling you all is well, and then all the wheels fall off at 6am on a Sunday. And then you won’t know where the wheels even were. Or that there were wheels. That is what I rail against. “The Cloud” has come to literally mean anything people want.

No, what I wanted was a bunch of machines I could call up and rent by the hour or day and do screenshots on.

And I got them.


Utilizing Amazon’s EC2 (Elastic Compute Cloud) is actually pretty simple, and there are an awful lot of knobs and levers you can mess with. They don’t tell you what else is sharing your hardware, of course, but they’re upfront about what datacenter the machines are in, what sort of hardware is in use, and all manner of reporting on the machine’s performance. It took me less than an hour to get a pretty good grip on what “machines” were available, and what it would cost.

I started with their free tier, i.e. a clever “try before you buy” level of machine, but running an X framebuffer and an instance of Firefox and then making THAT run a massive JavaScript emulator was just a little too much for the thing. I then went the other way and went for a pretty powerful box (the c3.2xlarge is the type) and found it ran my stuff extremely well – in fact, compared to the machine I was using to do screenshots, it halved the time necessary to get the images. Nice.
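Mechanically, the contraption is simple enough to sketch. This is not my actual script – just a minimal illustration of the virtual-framebuffer trick, assuming a Linux instance with Xvfb, Firefox, and ImageMagick installed, and a hypothetical emulator page to point the browser at:

```typescript
// Illustrative sketch of one Screen Shotgun pass. Requires Node, plus
// Xvfb, Firefox, and ImageMagick's `import` on the box.
import { spawn, execFileSync } from "node:child_process";

const DISPLAY = ":99";

// A virtual framebuffer for Firefox to render into -- no monitor needed.
const xvfb = spawn("Xvfb", [DISPLAY, "-screen", "0", "1024x768x24"]);
const firefox = spawn("firefox", ["http://localhost/emulator.html"], {
  env: { ...process.env, DISPLAY }, // hypothetical emulator page
});

// Give the emulator time to boot, then grab the whole virtual screen.
setTimeout(() => {
  execFileSync("import", ["-window", "root", "shot.png"], {
    env: { ...process.env, DISPLAY },
  });
  firefox.kill();
  xvfb.kill();
}, 30_000);
```

The real thing adds the error checking, boundary cropping, and the upload step, but that’s the skeleton.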

You pay by the “machine hour” for these, and I was using a machine that cost $0.47 an hour. Within a day, you’re talking $10. Not a lot of money, but that would add up. The per-hour cost also helped me in another way – it made me hunt down inefficiencies. I realized that uploading directly to archive.org was slowing things down – each upload had to wait in line for the inbox. Shoving things into a file folder on a machine I had inside the Internet Archive was much faster, since it just ran the file transfer and was able to go on to the next screenshot. Out of the two minutes per program, the file upload became completely negligible – maybe 1-2 seconds of uploading and done, versus the 1-2 minutes of putting it carefully into an item. Efficiency!

I then tried to find the least expensive machine that still did the work. After some experimentation (during which I could “transfer the soul” of my machine to another version), I found that c3.large did the job just fine – at $0.12/hr, a major savings. That’s where it sits for now.


Because I knew what I was dealing with, that is, a machine that was actually software to imitate a machine that was itself inside an even larger machine and that machine inside a datacenter somewhere in California… I could make smarter choices.

The script to “add all the stuff” my screen shotgun needs sits on a machine that I completely control at the Internet Archive. The screenshots that the program takes are immediately uploaded away from the “virtual” Amazon machine, so a sudden server loss will have very little effect on the work. And everything is designed so that it’s aware other “instances” are adding screenshots – if a screenshot already exists for a package, the shotgun will move immediately to the next one. This means I can have multiple machines gnaw on a 9,000-item collection (from different ends and in the middle) like little piranhas and the job will get done that much quicker.
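That “already exists” check is the only coordination the piranhas need, and it falls out of the archive itself. Here’s a minimal sketch of the idea – the filename and URL layout are hypothetical, and it assumes Node 18+ for the built-in fetch:

```typescript
// Sketch of the skip-if-exists check: a finished screenshot lives at a
// predictable archive.org download URL, so a HEAD request is the lock.
async function alreadyShot(identifier: string): Promise<boolean> {
  // Hypothetical filename convention for the first screenshot of an item.
  const url = `https://archive.org/download/${identifier}/screenshot_01.png`;
  const res = await fetch(url, { method: "HEAD" });
  return res.ok; // a 200 means another instance got here first
}

async function run(queue: string[]) {
  for (const id of queue) {
    if (await alreadyShot(id)) continue; // skip; move immediately to the next one
    // ... run the emulator, grab frames, upload the screenshots ...
  }
}
```

Losing an instance mid-run costs nothing but the screenshots it hadn’t uploaded yet; the next pass just fills the hole.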

In other windows, as I type this, I see new screenshots being added every 20 seconds to the Archive. That’s very nice. And the total cost for this is currently 36 cents an hour, in which time a thousand screengrabs might be handled.

I’m not “leveraging the power of the cloud”. I’m using some available computer rental time to get my shit done, a process that has existed since the first days of mainframes, when Digital and IBM would lease out processing time on machines they sold to bigger customers, in return for a price break.

It is not new.

But it does rule.



The JSMESS Sound Emergency —

UPDATE: I’m happy to say a developer has come forward and we’re out of the woods on sound. It’s not perfect, but the web audio API isn’t perfect and we’re much better armed for interacting with it now. Thanks, everyone.


Spread this one far and wide.

It’s rare I get anything close to desperate, but we’re somewhere in the realm of “stunningly frustrated” and so I can see where things are going. I can state the problem simply, and hopefully you, or someone you can reach out to, can be the one to do the (likely minor) work to make this happen.

Essentially, JSMESS has a real sound issue.

MESS does not – the program handles sound nicely and stuff sounds really great, just as its recreations of computers and other features are great. In most cases, these amazing MESS features have translated nicely into JSMESS. But not sound.


I have thrown a lot of good people at this morass. We’ve done a massive amount of work trying to get sound to improve. We have cases where it is very nice, and cases where it is horrible, grating.

It is holding back the project now. People want to hear the sound. Right now, it is simply not dependable enough to turn on at the Internet Archive. I want to be able to turn it on.

Like a lot of problems to solve with the web, we have two test cases you can try out: the Wizard Test and the Criminal Test.

Here is the Wizard Test. It’s an emulator playing the Psygnosis game “Wiz n’ Liz” on an emulated Sega Genesis. This is extremely tough on the browser – almost nothing can play it at 100% speed.

Here is the Criminal Test. It is an emulator playing Michael Jackson’s Smooth Criminal as rendered on a Colecovision. It is not tough on the browser at all. Almost everything should be able to do it at 100% or basically 100%.

In both cases, Firefox will play the emulator faster and will sound better. Chrome will generally do well, but will be slower. Internet Explorer will have zero sound. Safari… well, depends on the day. (And Opera is dead – it’s essentially a reskinned Chrome, just as SeaMonkey is a rebuilt Firefox.)


So, what do we know?

Well, part of this whole mess was a switchover to the Web Audio API. Mozilla’s browser had a nice audio API before that worked well, but only on Firefox. In theory, the new API will eventually work everywhere.

Here is a helpful chart describing that compatibility. So we’re working toward this Web Audio API.
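For flavor, here is roughly what feeding emulator audio through that API looks like – a minimal sketch, not JSMESS’s actual code, using the ScriptProcessorNode mechanism of the era (since deprecated in favor of AudioWorklet):

```typescript
// Minimal sketch: push emulator-generated samples out via Web Audio.
const ctx = new AudioContext();
const node = ctx.createScriptProcessor(4096, 0, 1); // bufferSize, inputs, outputs

// Hypothetical ring buffer the emulator fills as it runs.
const ring = new Float32Array(32768);
let readPos = 0;

node.onaudioprocess = (e: AudioProcessingEvent) => {
  const out = e.outputBuffer.getChannelData(0);
  for (let i = 0; i < out.length; i++) {
    out[i] = ring[readPos]; // underruns here are the audible glitches
    readPos = (readPos + 1) % ring.length;
  }
};
node.connect(ctx.destination);
```

Most of what we fight is what happens when the emulator can’t keep that buffer fed on time.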

My belief is only a relatively small number of people will be able to help. I am happy to entertain all ideas, discuss all possibilities. You can come to #jsmess on EFNet if you have IRC, or you can just e-mail me at audio@textfiles.com. I am willing to spend all the time you need to ramp up, or try any suggestion.

In the past, fresh eyes have helped us greatly in getting MESS to the fantastic position it is in now, where it can play tens of thousands of programs for hundreds of platforms. Here’s hoping your fresh eyes might help us further.


A Very Big Sort, or The Epic of Deaccessioning —

This has been years in the making.

[photo: IMG_7266]

When my living space looked like the photo above, it was just vaguely problematic. Considering what I do and what is specifically needed day to day, this project and storage situation was an issue.

But that was a while ago. Now we’re at this:

[photo: IMG_4977]

See, that’s seriously out of control.

There are two main contributing factors to that state: I had to quickly consolidate from other parts of the house I’m renting when space was needed for other items, and I simply did not have time to address incoming material when it came in, so it became a matter of just finding space for things mailed in and saving them for later.

This is to say nothing of the shipping container, which currently looks like this:

[photo: IMG_2693]

So, that’s a lot of stuff. That’s a 40-year-old’s ability to acquire material and bring it to bear into a storage space, with a splash of “divorced in his 30s” and “moved out of the house”.

But here come the big changes.

The book scanner is back up again. That provides me the ability to scan in materials before finding them a home elsewhere. My rule is nothing leaves the house in printed form unless it’s digitized in some fashion, and I have a copy of the digitization.

With the books scanned and going to a home, that then leaves magazines.

Magazines scanned and going to a home, that then leaves academic papers and proceedings.

Believe it or not, just getting those out will probably clear things up beyond belief – I’d say a MAJORITY of material in the container and my room are printed materials of this sort.

As I begin going through the books, I check for them on the Internet Archive’s Open Library site. I see if the book has been scanned already – and it’s quite shocking just how much the Archive has scanned over the past few years – because then I know I don’t have to scan it. I spend a small portion of the time that I would have spent scanning adding some background information to the Open Library entry, so that the book has a better look. I also ensure the cover image looks good, and so on. Then it’ll go into a box of outgoing material.
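If you wanted to automate that lookup, Open Library has public JSON endpoints that make it a one-liner. A minimal sketch, assuming I have the record shape right (the “ocaid” field on an edition points at a completed Internet Archive scan; the ISBN below is just an example):

```typescript
// Sketch: does Open Library already have a scan for this ISBN?
// Requires Node 18+ for the built-in fetch.
async function hasScan(isbn: string): Promise<boolean> {
  const res = await fetch(`https://openlibrary.org/isbn/${isbn}.json`);
  if (!res.ok) return false; // no record at all
  const edition = await res.json();
  return typeof edition.ocaid === "string"; // scan identifier present?
}

// Usage: decide whether a book goes to the scanner or straight to a box.
hasScan("9780262510875").then((scanned) =>
  console.log(scanned ? "already scanned -- box it" : "to the scanner"));
```

If the record is missing or has no scan, the book goes on the to-scan pile.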

With that going on, I’ll be getting a pile of to-be-scanned books and an outgoing pile to be sent away. That brings up the next issue.

Where do they go?


I do not throw out books. Let’s repeat that – I do not throw out books. I go a little further, in fact, and I will not give the books to a place likely to throw out the books. I consider that right up there with the cowardly act of leaving your longtime pet with the vet to be put down, while you tearfully drive home. You did the deed, you just didn’t own up to taking responsibility. So I’m not donating/contributing these books to a place that is likely to toss them.

This cuts down the potential field dramatically.

For books related to games and gaming, a home is already in place – the International Center for the History of Electronic Games at the Strong Museum in Rochester, NY. I’ve built up a great reputation with those folks, and they are delighted to be getting that set of material. They don’t mess around. They get things done:

[photos: IMG_1017, IMG_1054, IMG_1056]

One of the reasons I really like them is that they have an honest research library and space you can work in – you can get a hotel nearby, and go in and do actual work involving having a table and even a locker to store the printed materials in between days. It’s what I really want for this.

Because, you see, it’s not about me having the most stuff. It really never was. I am not interested in going after as many lost collections as possible, pushing them into a bigger pile, and declaring victory. I want this stuff accessible and useful.

About 4-5 times in the last two years, people have approached me asking if I have X, and the answer, in 3 of those cases, is essentially that I do have X, but X is buried way the fuck down in the shipping container, and good goddamned luck. Well, that’s not right at all.

So off they’ll go.

The question that remains is where.

I’ll be packing these scanned-or-verified books into boxes, putting them in bags before doing so, and then looking for a place these will go.

The place will have to have a phone number, people on salary, and physical space dedicated to storing and accessing the materials.

I want to talk to them and I will probably want to tour them.

So that search begins – here’s hoping I find one.

In the meantime, now begins what I hope is the next phase – slimming down my collection while making it available to the maximum number of people.

To the Scanner!



MindCandy: The Last Bright Star Before the Media Dims —


The MindCandy series got me started on this whole “get it down in a movie” trip.

Created with a love of the demoscene, a dedication to capturing the demos as accurately as possible, and, most importantly, a commitment to explaining the entire process from beginning to end, MindCandy was a breath of fresh air. DVDs were still relatively new in 2002; there were a few weird examples of discs using the features of the format, but few had the dedication to making the most of it that MindCandy did.

As I began work on the BBS Documentary, it was MindCandy’s inspiration (and their staff) which gave me the push I needed to make the final DVD as nice as it could be.

MindCandy Volume 1: PC Demos was followed up a few years later by MindCandy Volume 2: Amiga Demos. It was in every way as good as the first. They sold pretty well – they made back the cost but they’ll never make back the time spent to make them.

Then, finally, they released MindCandy Volume 3, a Blu-Ray/DVD combo that did its best to use all the insane measures of Blu-Ray and bring high-resolution captures of PC demos to the next generation of media equipment. It is truly an exquisite package, of a near and dear quality.

But of course, times had changed.

The trick of moving from CDs to DVDs, and from DVDs to Blu-Ray, has turned out to be a cul-de-sac in the journey of access to material. I saw this in 2010, and released GET LAMP with a gold coin because I knew it was going to be difficult to get people to buy physical media. By 2010, people were asking “Why can’t I download this?” And by 2011, people were asking “Why do I have to download this? Can’t I just stream it? Everywhere?” This world changed very fast.

Yes, there are still people who prefer the physical media. They want a nice package, a sense of an experience when they get the show in their mail. They are a shrinking group, and while they should be catered to, they are out of the realm of the majority of people. Some even think they’re part of this group and they seriously are not. Not really.

It’s pretty obvious where the world is heading, and so this graphical treat by Hornet (who designed the DVD and software to do amazing captures that are still used by the demoscene) is the bright, brilliant sunset of a spectacular triptych of works.

The model these guys should have gone with is Patreon (make top-quality exports and contextual interviews about demos, and release a set for money each month), but Patreon didn’t exist until recently, so here we are. A missed boat.

MindCandy 1 and MindCandy 2 sold out of their DVD media years ago. In response, MindCandy has released both of these products as Creative Commons-licensed downloads. You can grab them both from the site.


And now the last volume has been given a viking funeral, with the remaining stock being dropped to $12. I’m sure they’re taking a bath on this. The announcement said they made 2,500 copies and this was the last 700. Since this came out three years ago (2011), that’s slow sales and I’m sure this was a huge expense.

So, my learned advice to you is this: buy this artifact, this excellent work and package, as it rounds out a short but sweet arc of physical media meant to be the next generation.

Oh, and it’s top-notch.