In 2005, the late writer David Foster Wallace delivered a now-famous commencement address. It starts with the story of the fish in water, who spend their lives not even knowing what water is. They are naively unaware of the ocean that permits their existence, and the currents that carry them.
The most important education we can receive, Wallace goes on to explain, isn't really about the capacity to think, but rather about the choice of what to think about. He talks about finding appreciation for the richness of humanity and society. But it is the core concept of meta-cognition, of examining and editing what it is that we choose to contemplate, that has fixated me as someone who works in the tech industry.
As much as code and computation and data can feel as if they are mechanistically neutral, they are not. Technology products and services are built by humans who build their biases and flawed thinking right into those products and services, which in turn shapes human behavior and society, sometimes to a frightening degree. It's arguable, for example, that online media's reliance on clickbait journalism, and Facebook's role in spreading fake news or otherwise sensationalized stories, influenced the results of the 2016 US presidential election. This criticism is far from outward-facing; it comes from a place of self-reflection.
I studied engineering at Stanford University, and at the time I thought that was all I needed to study. I focused on problem-solving in the technical domain, and learned to see the world through the lens of equations, axioms, and lines of code. I found beauty and elegance in well-formulated optimization problems, tidy mathematical proofs, clever time- and space-efficient algorithms. Humanities classes, by contrast, felt to me like dreary, overwrought exercises in finding meaning where there was none. I dutifully completed my general education requirements in ethical reasoning and global community. But I was dismissive of the idea that there was any real value to be gleaned from the coursework.
Upon graduation, I went off to work as a software engineer at a small startup, Quora, then composed of only four people. Partly as a function of it being my first full-time job, and partly because the company and our product, a question-and-answer site, were so nascent, I found myself for the first time deeply considering what it was that I was working on, and to what end, and why.
I was no longer operating in a world circumscribed by lesson plans, problem sets, programming assignments, and intended course outcomes. I also wasn't coding to specs, because there were no specs. As my teammates and I were building the product, we were also simultaneously defining what it should be, whom it would serve, what behaviors we wanted to incentivize amongst our users, what kind of community it would become, and what kind of value we hoped to create in the world.
I still loved immersing myself in code and falling into a state of flow: those hours-long intensive coding sessions where I could put everything else aside and focus solely on the engineering tasks at hand. But I also came to realize that such disengagement from reality and societal context could only be temporary.
The first feature I built when I worked at Quora was the block button. Even when the community numbered only in the thousands, there were already people who seemed to delight in being obnoxious and offensive. I was eager to work on the feature because I personally felt antagonized and abused on the site (gender isn't an unlikely reason as to why). As such, I had an immediate desire to make use of a blocking function. But if I hadn't had that personal perspective, it's possible that the Quora team wouldn't have prioritized building a block button so early in its existence.
Our thinking around anti-harassment design also intersected a great deal with our thinking on free speech and moderation. We pondered the philosophical question, also very relevant to our product, of whether people were by default good or bad. If people were mostly good, then we would design the product around the idea that we could trust users, with controls for rolling back the actions of bad actors in the exceptional cases. If they were by default bad, it would be better to put all user contributions and edits through approval queues for moderator review.
We debated the implications for open discourse: If we trusted users by default, and then we had an influx of low-quality users (and how appropriate was it, even, to be labeling users in such a way?), what kind of deteriorative effect might that have on the community? But if we didn't trust Quora members, and instead always gave preference to existing users that were known to be high quality, would we end up with an opinionated, ossified, old-guard, niche community that rejected newcomers and new thoughts?
In the end, we chose to bias ourselves toward an open and free platform, believing not only in people but also in positive community norms and our ability to shape those through engineering and design. Perhaps, and probably, that was the right call. But we've also seen how the same bias in the design of another, pithier public platform has empowered and elevated abusers, harassers, and trolls to levels of national and international concern.
At Quora, and later at Pinterest, I also worked on the algorithms powering their respective homefeeds: the streams of content presented to users upon initial login, the default views we pushed to users. It seems simple enough to want to show users good content when they open up an app. But what makes for good content? Is the goal to help users to discover new ideas and expand their intellectual and creative horizons? To show them exactly the sort of content that they know they already like? Or, most easily measurable, to show them the content they're most likely to click on and share, and that will make them spend the most time on the service?
Ruefully, and with some embarrassment at my younger self's condescending attitude toward the humanities, I now wish that I had strived for a proper liberal arts education. That I'd learned how to think critically about the world we live in and how to engage with it. That I'd absorbed lessons about how to identify and interrogate privilege, power structures, structural inequality, and injustice. That I'd had opportunities to debate my peers and develop informed opinions on philosophy and morality. And even more than all of that, I wish I'd realized that these were worthwhile thoughts to fill my mind with, and that all of my engineering work would be contextualized by such subjects.
It worries me that so many of the builders of technology today are people like me: people who haven't spent anywhere near enough time thinking about these larger questions of what it is that we are building, and what the implications are for the world.
But it is never too late to be curious. Each of us can choose to learn, to read, to talk to people, to travel, and to engage intellectually and ethically. I hope that we all do so, so that we can come to acknowledge the full complexity and wonder of the world we live in, and be thoughtful in designing the future of it.
A leading Silicon Valley engineer explains why every tech worker needs a humanities education - Quartz