Stanford d.school’s Lisa Kay Solomon, Carissa Carter and Scott Doorley convened on Wednesday to discuss the importance of design in technology and the implications of creator and consumer decisions for the future. Focusing on three main points — consumer responsibility, representation through education, and prototyping — the speakers addressed what creators and consumers alike can do to improve humans’ relationship with technology.
The speakers first emphasized consumer responsibility for the technology we use. It isn’t necessary to learn its inner workings, they said, but it is essential to understand what the technology does and what it can do.
“You don’t have to be the coder, the technologist, but you absolutely need to know what that code can do,” Carter said. “And the point that goes hand in hand with this is who has access to that knowledge.”
In other words, there’s a direct correlation between education and the future of technology. Technology often mirrors the needs of its creators; if the people that are building technology only represent a fraction of the world, that technology may serve a limited demographic.
Carter emphasized the importance of representation in designing technology, coupled with access to education.
“If we want all of this technology to represent all of us, it needs to be created by all of us,” Carter said. “So how do we do that? How do we build tools and learning experiences to make that possible?”
Speakers presented six key machine-learning algorithms — clustering, classification, regression, reinforcement learning, association and dimensionality reduction — in digestible examples. In these examples, the speakers focused on the importance of applying algorithms to both conceptual and real-world scenarios to encourage representation through education.
For example, a 2013 internet meme surfaced comparing hot dogs and legs; in many photos, they looked almost indistinguishable. This is a classic case in which a classification algorithm can be used to tell one entity apart from another by features such as hair follicles, shadows and anatomy.
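The speakers did not show code, but a classification algorithm of the kind described can be sketched with a toy nearest-centroid classifier. The feature names and training values below are invented for illustration, not taken from the talk: each sample is reduced to two made-up numeric features (say, redness and curvature), and a new sample gets the label whose class average it sits closest to.

```python
# Toy classification sketch (hypothetical features and data):
# label a sample "hot dog" or "leg" by distance to class centroids.

def centroid(samples):
    """Average each feature across a list of feature vectors."""
    n = len(samples)
    return [sum(v[i] for v in samples) / n for i in range(len(samples[0]))]

def classify(sample, labeled_data):
    """Assign the label whose class centroid is nearest to the sample."""
    best_label, best_dist = None, float("inf")
    for label, samples in labeled_data.items():
        c = centroid(samples)
        dist = sum((a - b) ** 2 for a, b in zip(sample, c))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Made-up training data: [redness, curvature] per labeled example.
training = {
    "hot dog": [[0.9, 0.2], [0.8, 0.3], [0.85, 0.25]],
    "leg":     [[0.5, 0.6], [0.45, 0.7], [0.55, 0.65]],
}

print(classify([0.88, 0.22], training))  # -> hot dog
```

Real image classifiers learn far richer features, but the core move is the same: labeled examples define regions of feature space, and new inputs are sorted into them.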
A classification algorithm also depends on having the right kind of dataset; the speakers cited examples such as using ticketing data to predict where police stations should be located.
“We haven’t tried every algorithm on, and we can’t,” Carter said. “But it’s as simple as mad libs. Filling in is one way that we can start to understand some of these mischievous materials, how they actually work, and understanding their consequences.”
The speakers also said that building early prototypes, or test samples, is essential to minimizing the possible negative effects of products. A program using clustering algorithms, for example, could use data from a user’s social media feed to reveal personal information, such as sexuality, occupation or location, to third parties. Doorley emphasized the importance of a test run: first making early samples of a product, then updating it with second and third versions.
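The clustering scenario above can be illustrated with a minimal k-means sketch. Everything here is invented for illustration (the user vectors and the “hours online / posts per day” features are hypothetical, not from the talk): the algorithm groups users by raw behavioral numbers alone, surfacing groupings a user never explicitly disclosed.

```python
# Minimal k-means clustering sketch on invented "feed feature" data.
# Each user vector is [hours online per day, posts per day].

def assign(points, centers):
    """Map each point to the index of its nearest center."""
    labels = []
    for p in points:
        dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
        labels.append(dists.index(min(dists)))
    return labels

def update(points, labels, k):
    """Recompute each center as the mean of its assigned points.
    (Sketch only: assumes no cluster ever ends up empty.)"""
    centers = []
    for i in range(k):
        members = [p for p, l in zip(points, labels) if l == i]
        centers.append([sum(col) / len(members) for col in zip(*members)])
    return centers

def kmeans(points, centers, iters=10):
    """Alternate assignment and center updates for a fixed number of rounds."""
    for _ in range(iters):
        labels = assign(points, centers)
        centers = update(points, labels, len(centers))
    return labels, centers

# Hypothetical user vectors: two light users, two heavy users.
users = [[1.0, 0.5], [1.2, 0.4], [6.0, 8.0], [5.5, 7.5]]
labels, _ = kmeans(users, centers=[[1.0, 0.5], [6.0, 8.0]])
print(labels)  # -> [0, 0, 1, 1]: light and heavy users separate cleanly
```

The privacy concern the speakers raised follows directly: no one labeled these users, yet the clusters alone could proxy for sensitive attributes, which is why prototyping and testing early versions matters.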
“The point of thinking about what’s to come and understanding the technology that you’re working or using is critical, because of this idea that everything leaves a wake,” Doorley said. “So anything we make becomes the foundation for the next thing we make, and the next generation.”
The effects of individual actions become even more consequential in the era of mass consumerism.
“As an example, one car is a convenience; millions over decades is climate change; one rabbit-hole conspiracy video is a wasted evening, but millions of them are a stalled election,” Doorley said. “So you just have no idea what’s going to happen until it scales.”
According to the presenters, overlooking individual actions and decisions until their larger consequences metastasize onto a future generation is almost a given in our era of technology. Though new technology continues to morph after it is created, the speakers emphasized that at the beginning of a new era it is nearly impossible to predict the impact of our creations: they change as fast as creators make them.
The speakers added, however, that this doesn’t mean that our generation can absolve responsibility for the issues of the future.
“We really are now seeing — because the evidence and because of where we are living right now — that we can’t let it be to chance,” Solomon said. “That we have to step up and take responsibility not just for building it, but to push ourselves to imagine a range of possible futures, to look beyond observable events and to really think about the patterns and trends.”
As the end of 2020 nears, our dependency on technology has deepened with remote work and school. Yet it isn’t Zoom or Google we should be concerned about, according to Solomon — it’s how we use these technologies.
“We are the technology we should be most concerned about,” Solomon said. “The rate of change has finally surpassed our ability to understand what is really afoot. For better or worse, we’re the operating system that has the final say.”
Contact Christina Pan at christinapan1 ‘at’ gmail.com.