Brown Institute showcase features tour guide drones, defense contract database

By Fangzhou Liu, Managing Editor of News
September 30, 2016

From drones acting as tour guides to mass-analyzed cookie recipes, the projects on display at the Brown Institute’s first-ever Media Innovation Showcase were anything but conventional. Funded by Brown’s yearlong Magic Grants, interdisciplinary teams from Stanford and Columbia pioneered new techniques for every step of the storytelling process, from research tools for journalists to new ways of telling a story.

The Brown Institute was established in 2012 as a collaboration between Stanford University and Columbia University’s School of Journalism. Longtime Cosmopolitan editor-in-chief Helen Gurley Brown endowed the institute so that students at her late husband David Brown’s alma maters — Stanford (’36) and Columbia — might work on new endeavors in media innovation.

Stanford Brown director Maneesh Agrawala said in his opening remarks, “Tonight, I am super excited to present the projects we supported over the past year. They feature lots of new technologies, new storytelling techniques, as well as new and interesting stories.”

G:Drone: the first companion drone

The team behind G:Drone envisioned a tour guide drone that goes beyond surveilling human activity from above, getting up close and personal with users’ interests. Jessica Cauchard Ph.D. ’13 explained that the project initially grew out of her team’s interest in drone cinematography but rapidly became a novel exploration of drone-human interaction, spanning questions in the social sciences and computer vision as well as human-computer interaction. “This is the only drone system I know of that does both input and output,” Cauchard said. The G:Drone hovers close to its users as they tour an unfamiliar place, projecting a user interface onto the surrounding pavement.
Users can read maps, see interior views of buildings and read background information on the sights simply by pointing an arm at part of the interface, as though they had a larger-than-life touchscreen. Piloting a drone is an uphill task on its own, and rendering graphics and interacting with human beings are challenges even for computers on the ground. Yet Cauchard explained that the G:Drone’s petite frame carries everything it needs to do its job, without help from external computers. “We embed a projection camera system on the drone, so there’s no computer controlling the system. The 3D sensor and pico-projector are all on the drone itself.”

Cauchard’s team hopes to develop the G:Drone to help journalists document inaccessible or unmapped areas, such as Syrian cities destroyed by the ongoing civil war.

Recipe Mining: What’s in a cookie?

A “deconstructed cookie” might sound more like an entry on a fine dining menu than a computer science project, but Brown Fellow Juho Kim M.S. ’10 found the perfect natural language dataset in the glut of chocolate chip cookie recipes on the Internet. Using natural language processing techniques, Kim distilled hundreds of thousands of cookie recipes into their simplest form to uncover the “median chocolate chip cookie.”

“My question was, ‘What makes a cookie a cookie?’” Kim quipped. In one sense, every chocolate chip cookie recipe is a set of standard processes and ingredients. Yet each formulation also varies just enough from the rest to be storied in its own right, so that someone somewhere swears by it as the perfect chocolate chip cookie. Kim’s visualization aims to capture both the essentials of every cookie recipe and the differences that make each iteration of a basic step unique.

Kim eventually hopes to extend his current project to sets of simultaneous baking videos “to capture the practice and culture behind cooking” — his own form of storytelling.
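Kim’s actual pipeline isn’t described in detail here, but the core idea of a “median chocolate chip cookie” can be sketched in a few lines of Python. In this minimal sketch, each recipe is assumed to have already been reduced (via NLP) to ingredient-quantity pairs; the three recipes and their gram amounts below are invented for illustration, not Kim’s data:

```python
from statistics import median

# Hypothetical recipes, each already distilled to ingredient -> grams.
recipes = [
    {"flour": 280, "butter": 225, "sugar": 200, "chocolate_chips": 340},
    {"flour": 300, "butter": 200, "sugar": 150, "chocolate_chips": 300},
    {"flour": 250, "butter": 230, "sugar": 220, "chocolate_chips": 350,
     "walnuts": 100},
]

# Treat an ingredient as "core" if it appears in a majority of recipes.
all_ingredients = {ing for r in recipes for ing in r}
core = {ing for ing in all_ingredients
        if sum(ing in r for r in recipes) > len(recipes) / 2}

# The "median cookie": the median quantity of each core ingredient.
median_cookie = {ing: median(r[ing] for r in recipes if ing in r)
                 for ing in sorted(core)}
print(median_cookie)
# Walnuts appear in only one of three recipes, so they drop out.
```

Run on the toy data above, this keeps flour, butter, sugar and chocolate chips and discards the walnuts, which is the same essentials-versus-variations split Kim’s visualization tries to capture at a much larger scale.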
Open Contractors visualizes defense contracts

Through their Open Contractors project, Allison McCartney M.A. ’15 and Alex Gonçalvez strove to make famously opaque defense contract data accessible to journalists. The user interface they created lists every Department of Defense contract from a comprehensive table released a few years ago, broken down by contractor and contracting agency. Users can search for specific companies’ defense contract records and find their parent companies, subsidiaries and aliases with the click of a button.

With a quip at the Packard building where the showcase was held, McCartney said, “You can look up everything from nuclear bombs to Hewlett Packard — or search for every contract related to Hurricane Katrina, for instance.”

But if Open Contractors’ user interface is maximally intuitive, the original data certainly was not, Gonçalvez said. He explained that the single table they started with had to be broken down into roughly 100 new tables to make sense of the numbers. The bulk of their work went into providing the context and consistency that give users a real sense of the correlations behind the figures. “Most importantly, we have a plain-text explanation of what’s going on with each contract,” McCartney added to cheers from the audience.

Contact Fangzhou Liu at fzliu96 ‘at’ stanford.edu.