Just in time for the tenth anniversary of the September 11th attacks, Greenpoint resident Brian August will launch the new (and free) smartphone app 110 Stories, which uses augmented reality technology to render the outlines of the World Trade Center towers when users point their devices at Lower Manhattan. Users can then post the memories conjured by the image of those iconic buildings to 110stories.com. We talked to August about the project’s origins and how it will work.
When did you first get the idea for 110 Stories?
Last summer. At that point I lived on the corner of Keap and Hope, and on my rooftop I took a piece of copper tubing and bent it into the shape of the towers. I’d already spent so much time staring at the horizon from my roof that I knew exactly where they were, and I knew exactly how big they should be. So I took this piece of copper tubing that was maybe three feet high, handed it to my friend, and said, “Go stand over there and hold that, hold it up like a flag.” I positioned myself 20 feet back with my iPhone to get the right perspective, because you’ve gotta be in the right spot to make it work, and I snapped the picture, just thinking, “Let’s see how this comes out.” And we both looked at it and were like, “Holy shit, it looks like the skyline.”
So do the towers in the app look like they’re made of copper wire?
It doesn’t look like copper wire, but we kept its imperfections. It inspired me to make the rendering somehow look humanized. You look at it, and you know the buildings aren’t real because it’s either a black line or a white line, but they’re so accurate and in the right spot that you feel the same way you would if they were rendered perfectly. You actually feel better, in a way, because you know they’re not real.
In many respects 110 Stories seems to be more about the stories people will post than about the buildings’ silhouettes.
The image is very simple; it’s there to cajole you, to pique your memory and make you go, “Whoa, I forgot about that.” It has a lot of different implications for different people, and I think that’s really healthy.
What’s the geographical range of the app?
I would like the range of the app to go almost out to the absolute physical distance from which you could see them, which I think was 49.7 miles at sea level on a perfectly clear day. My bigger worry is how it’s going to work when you’re two blocks away. That’s where it’s going to get clunky, just because of the surrounding buildings, and the whole perspective gets skewed. You see pictures from underneath where you get all those converging lines going up, looking like train tracks. That’s gonna be much harder to render than it will be from three miles away, where it’s just perfect against the skyline.