GV aims to help build a company that can sniff out disease, literally

Alex Wiltschko has what he thinks is a big idea. He wants to build a company that digitizes scent.

It’s a natural step for Wiltschko, who has a PhD in neurobiology from Harvard, where he studied how the brain processes odor. He didn’t wind up in this specific group accidentally, he suggests; it owes to a lifelong “obsession with scent and olfaction” that he came to study alongside Sandeep “Bob” Datta, a Harvard professor who has himself long focused on what happens after one’s sensory neurons pick up a scent.

The researchers were trying to better understand how the human brain works — including why certain scents are tied to memories. For a long time, too, their area of study was dwarfed by the attention that sight and image processing have received over the years. Then came COVID-19, and with it, far more focus on how taste and smell are processed — and lost.

Now the race has begun to better understand and digitize and even recreate scent. Indeed, in July, a neurotech startup called Canaery raised $4 million in seed funding to develop a scent-sensing platform. Moodify, another startup that’s working on the digitization of scent, closed an $8 million round of funding last year, including from Procter & Gamble.

As Datta told Harvard Magazine late last year, “Right now, there’s a lot of intense interest in smell from physicians and from the many millions of patients who’ve had their sense of smell affected. And it has really highlighted, collectively, how little we know about all aspects of our sense of smell.”

Wiltschko is among the select few, for now, who see opportunity in solving these unknowns. His longtime employer, Google, also sees it. After logging close to six years with Google AI, Wiltschko just became an entrepreneur-in-residence (EIR) at GV, the venture arm of the search giant, where he hopes to build a company that can identify disease faster based on specific odor molecules.

It’s a meaningful vote of confidence from GV, which has appointed only five life-science-focused EIRs in its 13-year history and has also incubated Flatiron Health (which went on to sell to the pharma giant Roche in 2018 for $1.9 billion); the gene-editing company Verve Therapeutics, which went public last year; and Rome Therapeutics, a startup that’s developing therapies for cancer and autoimmune diseases by focusing on portions of DNA that have been largely overlooked by researchers, it says. (Rome has raised at least $127 million over two rounds of funding so far.)

The big question, of course, is what will come of the effort. To learn more about how he’s approaching his mission, we talked yesterday with Wiltschko, who was personable but also reluctant to share too much. Our conversation has been edited lightly for length and clarity.

TC: I haven’t come across something like this before. Are you trying to better understand how to build neural networks based on how people process and compartmentalize information about odors?

AW: Taking a step back, every time computers got a new “sense,” like to see or to hear, society completely changed for the better, right? When we first learned how to store visual images, in the 19th century, and eventually how to store them on computers in the 20th century, all of a sudden, we could do things like take X-rays. We could do things like store memories [of] the visual world. And we didn’t need painters to do it, everybody could do it. We did it again for hearing; we [could make] music [captured in one location] available to the masses.

But computers can’t smell. They have no ability to detect the chemical world [so] we can’t store the really powerful memories that we associate with smell, like the smell of my grandmother’s home. It’s just gone. It only lives in my mind. The smells of people who I love, and of places I’ve been, are completely ephemeral today.

We [also] know that diseases have a smell. We know that different wellness and health states have a smell. Plants, when they’re sick or when they’re healthy, smell different. The amount of information that’s out there in the world that we could potentially act on to make our lives longer, make our lives more joyous, grow more food — that’s really only able to happen inside of living things, inside of living noses. If we could automate that, we could have a massive and positive impact on society.

What applications do you have in mind?

I think the North Star for me — and I don’t know how long it’s going to take to get there — is that we become capable of smelling diseases, so that we can detect them earlier than we currently can. There are lots of stories out there — lots of anecdotes and various papers — and the research has kind of built up a picture for me that we can smell Parkinson’s much, much earlier than we could otherwise detect it; we can smell diseases much, much earlier. And if we could actually build devices that can turn that information into digital representations, then potentially we could catch diseases earlier and learn how to treat them better.

How do we know that we can detect Parkinson’s through scent earlier than by any other means?

There’s not one single slam dunk showing that we can detect diseases earlier, but there are a lot of stories that all have their strengths and weaknesses and that add up to a clearer picture. For Parkinson’s, there’s a nurse who first reported that she could smell Parkinson’s in her husband before he actually developed it, and they put it to the test. They gathered T-shirts that men had worn — half of them with Parkinson’s, half without Parkinson’s — and said, ‘Hey, can you tell which of these T-shirts were worn by a person with this disease?’ She got almost all of them right except for one, and she said [to the researchers], ‘Actually, I think you’re wrong.’ And that person did end up developing Parkinson’s disease.

They took the story further and tried to isolate exactly what it was that she was smelling. And researchers found the exact material being emitted by the body: this waxy substance called sebum that is excreted by cells on your back. And they found the exact molecules that she was smelling. But it was her nose, it was her ability to take an olfactory picture of the world and turn it into a notion of whether or not someone was sick, that preceded all of that.

If we can ultimately digitize scent as you suggest, do you have any concerns that odors could be manipulated for certain means — perhaps to make people think that they’re in danger when they aren’t, or safe when they’re in danger? There’s much good that comes from new technologies but also second-order effects that we don’t always think through.

It’s definitely important whenever a new area of technology is developed to think through those things, for sure. One area that I think is nascent, it’s not at all clear where it could go, but at least I personally am relaxed by certain scents. I don’t know why. And so I think there’s a lot for us to learn in that space.

Have you studied the effects of COVID-19 on sense of smell?

Me personally? No. But my former mentors have certainly been looking at this very, very closely. We started a lot of this research into what people think things smell like when COVID was just starting. And we had to be very careful, because people would lose their sense of smell when they got COVID. And if you’re studying what people think things smell like, you need to be very, very careful if folks suddenly become anosmic — that’s the term for having lost the sense of smell. And so we had to build all kinds of new checks and balances into our research protocols.

And now you’ve joined GV with the idea of developing a company. What sorts of resources are available to you? Will you be partnering with some of your former colleagues at Harvard Medical School? I’m assuming you need access to many datasets.

What’s wonderful about starting to work on this idea today versus maybe 10 years ago is that the ecosystem of people who are working on olfaction or scent has grown dramatically. And I think the attention that’s being paid to our sense of smell, because now we’re understanding how important it is when we lose it, [has fostered a] much richer ecosystem of folks who are working on and thinking about olfaction.

Are there already companies up and running that are trying to do exactly what you hope to do?

It’s a vibrant ecosystem, and there are lots of folks who are working on different pieces of it. What’s really wonderful about joining GV as an entrepreneur in residence is being able to take the broad view and think about how I can have the most impact in the space of digital olfaction.

Can you share more about that path? You mentioned Parkinson’s. Is the idea to focus first on being able to diagnose Parkinson’s, then to build around that, or are you taking a multi-pronged approach?

Harkening back to the other senses, there are just 1,000 things you could do if you could take the visual world, or all of sound, and store it in a computer and analyze it. Those are the two edges of the sword — there are so many opportunities, so many places to start, [but] on the other hand, you have to focus. So that’s where I spend a lot of my time: thinking about what’s the right path, specifically, to chart towards our North Star, which is improving the well-being and length of human life.

People offer wildly varying timelines when it comes to when we might see artificial general intelligence. Some say it’s 5 years away. Some say 10 years. Some say 500. What’s your best guess when it comes to how far away we are from digitizing the sense of smell?

It took maybe 100 years to digitize our sense of sight. I think we can compress digitizing our sense of smell into a fraction of that. It’s not going to be easy. It’s going to take a lot of work. But now’s a good time to start.

Source @TechCrunch
