urbanos

 

ABOUT

For my thesis at ITP, I created an experimental research and design fiction project that uses the analogy of the computer operating system to re-imagine the “smart city”, carried out through three augmented reality case studies.

You can watch my presentation that goes into more detail on each of these case studies on the ITP Thesis 2020 Archive.

abstract

As the smart city movement continues to grow, I believe it is imperative to think deeply and critically about the design of these cities and their underlying architectures, data structures, communication protocols and interfaces - essentially their operating systems. I argue that it is worth questioning some of the systems we accept as “neutral” and thinking about new computing approaches - ones that foreground equity, environmental sustainability, and maintenance/care - as we integrate more and more technology into the fabric of the physical space we inhabit. This research project consists of three case studies and a user manual. Each case study uses a specific smart city component found in my neighborhood (Bed-Stuy, Brooklyn) or on NYU’s Brooklyn campus (Downtown Brooklyn) as a way to imagine different parts of a speculative Urban_OS ecosystem.

The case studies re-think and re-design smart city hardware, interfaces and memory systems using augmented reality. The manual ties these case studies together with design principles developed by transposing the original 17 principles of UNIX, one of the oldest and most ubiquitous operating systems. The manual was additionally informed by a wide range of ongoing interviews with experts in the community, in New York City government, at IoT companies, and at academic and research institutions. Together, these experimental case studies and the manual make up an ongoing topology meant to generate conversation around the ethics, aesthetics, and methods of smart city design and technology. This project challenges designers, city planners, academics and policymakers to ask: if we are embedding computing into our urban landscape, what values are we embedding by doing so?
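To make the transposition idea concrete, here is a minimal sketch (in Python, purely illustrative) of how a few UNIX rules might map to Urban_OS principles. The UNIX rule names follow Eric Raymond's well-known list of 17 design rules from The Art of Unix Programming; the urban counterparts shown here are hypothetical placeholders, not the wording used in the manual.

```python
# Illustrative only: one way to record the transposition of UNIX design
# rules into Urban_OS principles. The urban wording below is hypothetical.

URBAN_OS_PRINCIPLES = {
    "Rule of Modularity": "Build urban systems from simple parts connected "
                          "by clean, inspectable interfaces.",
    "Rule of Transparency": "Design infrastructure so residents can see "
                            "what data it collects and why.",
    "Rule of Repair": "When a system fails, it should fail visibly and be "
                      "maintainable by local stewards.",
    # ...the remaining rules would be transposed in the same way.
}

for unix_rule, urban_principle in URBAN_OS_PRINCIPLES.items():
    print(f"{unix_rule} -> {urban_principle}")
```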

case studies


Case Study 1: Hardware

In the first case study, I redesign the outer hardware of the LinkNYC Wi-Fi kiosks for the average pedestrian. A critique of generative design, this experiment asks how these new design processes impact our embodied experiences with city infrastructure. I combine community input with machine learning tools and with the Urban_OS design principles.

Many companies, academics and organizations are experimenting with incorporating AR into co-design processes. For example, IDEO CoLab prototyped an AR platform that lets residents and planners design together to improve the experience of development permit/land use notices, and the MIT Media Lab City Science group uses hybrid tangible-digital platforms to reveal high-level city planning patterns and the impacts of decisions. With this case study I’d like to ask: could a tool be made that draws on the strengths of both machine learning and local knowledge? Could there be a design process that requires humans and AI to work together? The purpose of this case study is not to arrive at a solution, but to try a different approach.

The result is a set of 3D models for future Links that can be viewed on location in AR.
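As a thought experiment, the sketch below shows one way the hybrid human/machine process at the heart of this case study could be structured: generated kiosk forms are ranked by blending a model-assigned fitness score with community votes gathered on site. The candidate names, scores, votes, and weighting are all hypothetical and are not drawn from the actual project.

```python
# A minimal sketch, not the thesis code: blending machine-generated fitness
# with community preference to rank candidate kiosk forms.

from dataclasses import dataclass

@dataclass
class KioskCandidate:
    name: str             # label for a generated 3D form
    model_score: float    # fitness assigned by a generative/ML tool (0-1)
    community_votes: int  # votes from residents reviewing the form in AR

def rank_candidates(candidates, total_votes, ml_weight=0.5):
    """Rank candidates by a weighted mix of model score and vote share."""
    scored = []
    for c in candidates:
        vote_share = c.community_votes / total_votes if total_votes else 0.0
        blended = ml_weight * c.model_score + (1 - ml_weight) * vote_share
        scored.append((blended, c))
    return [c for _, c in sorted(scored, key=lambda pair: pair[0], reverse=True)]

candidates = [
    KioskCandidate("shade-canopy form", model_score=0.72, community_votes=41),
    KioskCandidate("bench-integrated form", model_score=0.65, community_votes=58),
    KioskCandidate("minimal-pylon form", model_score=0.88, community_votes=12),
]
for c in rank_candidates(candidates, total_votes=111):
    print(c.name)
```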

Case Study 2: Interface

In the second case study, I re-imagine the Rain Garden (part of NYC’s green infrastructure plan) as an interface for the Rain Garden Stewards and local residents. A critique of the optimization dashboard, this experiment questions what is considered urban intelligence and asks how the location and scale at which data are encountered matter.

What is striking about Rain Gardens is that they are invisible infrastructure. Underneath their surface are “imperceptible flows”, a living, spongy world of dirt, rocks, and gravel doing hidden work as part of a larger water ecosystem (building on a concept from Spongiform by anthropologist Andrea Ballestero). There is little publicly available data on Rain Gardens, and what does exist is separate from the physical garden, usually located in an open data portal or in a report produced by the city. Within the context of the larger water system, this experiment aims to help the passerby connect the dots between different urban systems, redirecting attention to expand previously conceived boundaries of citizenship and to aid understanding of the non-human rhythms and cycles in our cities. Could an interface heighten our perceptual sensitivity around the rain garden?

The final version is an AR app that visualizes on-site water, plant and air data.
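For readers curious what on-site data could look like under the hood, below is a hedged sketch of the kind of data model an AR rain garden interface might render in place. The field names, units, sample helper, and query radius are assumptions for illustration, not the app's actual schema.

```python
# Hypothetical data model for geolocated rain garden readings that an AR
# client could anchor to the physical garden. Not the project's real schema.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class RainGardenReading:
    garden_id: str                 # e.g. a city asset identifier
    lat: float
    lon: float
    timestamp: datetime
    soil_moisture_pct: float       # how "spongy" the bioswale is right now
    stormwater_captured_l: float   # estimated litres diverted from the sewer
    air_quality_pm25: float        # nearby particulate reading, ug/m3
    plant_species: list[str]       # what is actually growing in the garden

def readings_near(readings, lat, lon, radius_deg=0.0005):
    """Return readings close enough to the viewer to anchor AR content."""
    return [r for r in readings
            if abs(r.lat - lat) < radius_deg and abs(r.lon - lon) < radius_deg]
```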

Case Study 3: Memory

For the final case study, I look at the BigBelly trash bins and develop a standard data-storage specification for future IoT devices, to be used by city agency field workers. A critique of digital transparency in the smart city movement, this project asks what policy implications might come from a framework that aids in understanding the complex social dimensions of objects.

As Bruce Sterling says in his book Shaping Things, “Properly understood, a thing is not merely a material object, but a frozen technosocial relationship.” These technosocial relationships are formalized through a computational protocol and stored as data. Through this case study I aim to ask: what information would we want stored in the BigBelly’s database? Put another way, what values are we designing into our technologies, and how could they be made accessible?

The speculative database can be viewed and interacted with in AR on the street.
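To ground the idea of a “frozen technosocial relationship” as stored data, here is a speculative sketch of what a single bin’s memory record might contain, including who receives its data and who has cared for it. Every field and method here is a hypothetical illustration, not the BigBelly platform’s real schema.

```python
# Speculative record for one bin: telemetry plus the social and maintenance
# history that the AR street view could surface. All fields are hypothetical.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MaintenanceEvent:
    timestamp: datetime
    worker_agency: str   # e.g. a sanitation crew or BID staff
    action: str          # "emptied", "repaired compactor", "cleaned sensor"
    notes: str = ""

@dataclass
class BinMemoryRecord:
    bin_id: str
    location: str                 # street corner, not just coordinates
    manufacturer: str
    data_collected: list[str]     # what the device senses and transmits
    data_shared_with: list[str]   # agencies/vendors receiving the data
    maintenance_log: list[MaintenanceEvent] = field(default_factory=list)

    def to_ar_card(self) -> dict:
        """Flatten the record into the fields an AR overlay might show."""
        return {
            "bin": self.bin_id,
            "senses": ", ".join(self.data_collected),
            "shared with": ", ".join(self.data_shared_with),
            "last serviced": max(
                (e.timestamp for e in self.maintenance_log), default=None
            ),
        }
```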
