
Small-Scale Question Sunday for April 21, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


Apropos of a conversation I had with my running partner this weekend, how close are we to the “pixel” of smell and taste? I.e., is there any way to break down scent and flavor the same way sight and sound are broken down with pixels and the Fourier transform?

You’re kind of touching on two questions.

The thing about images is that the map is not the territory. Concerns like pixels—resolution—only sneak in to quantify the limits of that map.

A mathematical construct like the Fourier transform doesn’t have that problem. The transform of a pure sine wave is the Platonic ideal of a pair of points. But you can’t make such a pair out of samples. You’re forced to approximate, which gives you a resolution.
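A quick numerical sketch of that point (the sample rate and frequency are my own arbitrary choices, not from the thread): sampling a pure sine and taking its discrete Fourier transform locates the peak only to within one frequency bin, and that bin width is the resolution you're forced into.

```python
import numpy as np

# A pure sine wave's Fourier transform is ideally a pair of points,
# but a finite set of samples forces an approximation with limited
# frequency resolution (bin width = sample_rate / n_samples).
sample_rate = 1000        # samples per second (arbitrary choice)
n_samples = 1000          # one second of signal
t = np.arange(n_samples) / sample_rate
signal = np.sin(2 * np.pi * 50 * t)   # a 50 Hz sine

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(n_samples, d=1 / sample_rate)

peak = freqs[np.argmax(spectrum)]
resolution = freqs[1] - freqs[0]
print(peak)        # 50.0: the peak lands on the 50 Hz bin
print(resolution)  # 1.0 Hz: the smallest distinguishable step
```

Doubling the number of samples halves the bin width, which is exactly the sense in which sampling "gives you a resolution."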

So question 1 is “do we have a map to quantify smell?” The answer is yes, but no one can agree on which map is best. Here’s a more recent study with a bunch of cool charts showing the perceptual space. There’s also the classic OChem Smells Chart.

Question 2 is how good the resolution is for any of these models. For sound and sight, we’ve done experiments to identify how small a difference can be recognized. Presumably, something similar has been tried in the smell literature. In theory, you could use one of the Question 1 schemas to choose several components of smell, say “edibility,” “temperature,” and “irritation,” then test different substances on each axis to estimate resolution. That’d give you a map of possible, distinguishable smells.
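A toy sketch of that counting argument, with entirely made-up axes and numbers (the ranges and just-noticeable differences below are illustrative, not measured): the number of distinguishable smells under such a map is the product of resolvable steps per axis.

```python
# Hypothetical perceptual axes for smell, each with a made-up range
# and just-noticeable difference (JND) that an experiment might yield.
axes = {
    "edibility":   {"range": 10.0, "jnd": 0.5},
    "temperature": {"range": 8.0,  "jnd": 1.0},
    "irritation":  {"range": 6.0,  "jnd": 0.3},
}

# Resolvable steps per axis = range / JND; the map's total capacity
# is the product across axes.
distinguishable = 1.0
for axis in axes.values():
    distinguishable *= axis["range"] / axis["jnd"]

print(round(distinguishable))  # 20 * 8 * 20 = 3200 distinguishable smells
```

The point is only structural: once you have axes and a per-axis resolution, the "map of possible, distinguishable smells" falls out as a grid.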

I’m going to be lazy and assume the same is true for taste.

I’m also lazy. I agree with everything you said; my argument with him was that the difficulty of simulating taste and smell meant that anything short of direct neurostimulation was likely to be uneconomical.

It appears that humans have between 350 and 400 olfactory receptors, so I suppose once we fully describe them we'll have as good a model for smell as we can get. Taste seems to be a lot simpler, and yet people are still finding new receptors there as well (though having tasted salt licorice I would say that's one better left unstimulated).

I would assume taste is much easier than smell, as there is only a handful of things tastebuds can detect. But then you need to combine that with smell…

I would guess that smell would have to be embedded within a higher-dimensional space than sight or sound? But I'm not certain.

There are languages with fairly developed vocabularies for describing smell; it's just that English isn't one of them.

I don't know the answer, but I think the first step would be to try to quantify smell and taste as precisely as sight and sound can be quantified.

Sight and sound are relatively easy to quantify. You can quantify sight as a function that maps (x, y, time) tuples to (r, g, b) color value tuples for example. You can quantify sound as a function that maps (time) tuples to (amplitude) tuples.*

As far as I know, no one has managed to quantify smell and taste in such a way. However, I could be wrong about that.

*(time) and (amplitude) are tuples with only one item each in them, but I am calling them tuples for the sake of consistency. In mathematical parlance, it's still a tuple even if it has 0 or 1 items.
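The two representations above can be sketched as arrays, where "resolution" is just the granularity of the sampling grid (the dimensions here are arbitrary illustration, not real video or audio specs):

```python
import numpy as np

# Sight as a function (x, y, time) -> (r, g, b): a 4-D array
# indexed by frame, row, and column, holding an RGB triple.
height, width, n_frames = 4, 6, 10
video = np.zeros((n_frames, height, width, 3), dtype=np.uint8)

# Sound as a function (time,) -> (amplitude,): a 1-D array of samples.
n_samples = 100
audio = np.zeros(n_samples, dtype=np.float32)

print(video.shape)  # (10, 4, 6, 3)
print(audio.shape)  # (100,)
```

The open question in the thread is what the analogous array axes would even be for smell or taste.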