Nicole (trickykitty) wrote 2014-09-24 08:11 am
What Does Noise Look Like?
I keep thinking one of the biggest obstacles to overcome in programming isn't figuring out how to find patterns, but understanding what "noise" looks like to the brain.
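To make the programming framing concrete, here's a tiny sketch (the names and numbers are all my own, just for illustration): pure noise has nothing that repeats, a "pattern" is the same noise with something repeating buried inside it, and even a simple autocorrelation check can tell the two apart.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pure "noise": independent random samples with no structure to latch onto.
noise = rng.normal(size=1000)

# A "pattern": the same noise with a periodic signal buried inside it.
t = np.arange(1000)
signal = noise + 0.8 * np.sin(2 * np.pi * t / 50)

def periodicity_score(x, lag):
    """Autocorrelation at a fixed lag: near zero for pure noise,
    clearly positive when something repeats at that lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

print(periodicity_score(noise, 50))   # ~0.0 - nothing repeats
print(periodicity_score(signal, 50))  # noticeably positive - pattern found
```

The hard part isn't the check itself - it's that a brain (or a program) first has to discover which lags, which features, are even worth checking.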
We know we adapt and start to discern. When a baby is born, all the baby sees is "noise", even though later that baby will grow up to see colors and shapes and objects. The same holds true for all the major senses.
While I was off work last week, I not only watched a lot of volcano videos, but also a lot of videos of people getting their cochlear implants turned on for the first time (as well as a couple days' worth of other random videos, which I might get around to talking about later). In a few of those videos, the doctor gave the patient an explanation that made perfect sense. Not every doctor explained it on camera, though the patients may have been briefed before the activation day. Either way, I think it should be explained to everyone.
I work in a dark office, and there are times when I have to go outside for something. The sun is so bright it hurts my eyes and I can't see. My sense of sight is "screaming" into my mind, probably along with some physical pain from the constriction of my pupils. Either way, it hurts until my eyes can acclimate and adjust (or until I run back inside and grab my polarized sunglasses).
Well, that's sometimes what sound is like for a person who was born deaf and suddenly has their implant turned on. It's really loud, it might feel a bit like their brain screaming at them, and it's all noise at first. They might have a hard time knowing what they're hearing. They do know they'll be hearing the doctor and loved ones in the room speaking early on, so those connections are made easily and quickly - the sound of voices combines with their knowledge of words and with what they imagined the words would sound like, especially if they had previously tried to talk rather than only sign. But they don't know that the hum they're hearing is the A/C unit or the fans in the desktop PC. If the room is mostly quiet, they can make out the voices of the other people in the room, but what are they to make of those other sounds - the people in the office next door or down the hallway, or someone's cell phone ringing? It's a lot to take in. Very quickly, though, the newly activated neurons start to acclimate and cope, and the perceived volume begins to come down somewhat. The learning doesn't stop there. It takes a long time for a no-longer-deaf person to learn all the sounds that we hearing folks have been hearing for a couple of decades.
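That "perceived volume coming down" is essentially gain adaptation. Here's a crude toy model of the idea - entirely my own sketch with made-up numbers, not how implants or neurons actually compute:

```python
def adapt(inputs, rate=0.05):
    """Perceived intensity = input scaled by a gain that relaxes as a
    running estimate of 'normal' loudness catches up with the input."""
    gain, reference = 1.0, 0.0
    perceived = []
    for x in inputs:
        perceived.append(gain * x)
        reference += rate * (x - reference)  # running average of ambient level
        gain = 1.0 / (1.0 + reference)       # louder surroundings -> lower gain
    return perceived

# A constant loud input, like the moment the implant switches on:
p = adapt([10.0] * 60)
print(round(p[0], 1), round(p[10], 1), round(p[-1], 1))
# 10.0 at first (overwhelming), then 2.0, then it settles near 1.0
```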
This can be applied to my interest in child development. Sights, sounds, and even the sense of touch start off as nothing but noise. We know there's an area of the human brain dedicated to processing human faces, but we're not completely sure whether we're born with it doing that processing, or whether it would process other things - the anatomy of trees, say - were we raised by them instead of by other humans. It's possible that area is dedicated to imprinting rather than to human face recognition specifically, and it just so happens that we tend to imprint on other humans. Settling that would require a lot of baby MRI testing (something that's really hard to do) compared against animal MRI testing (probably even harder than keeping babies still) in order to fully understand how that area starts off processing noise and gradually turns into a human face processing area. Those results could then be compared to other animals' brains, to see what that area processes for them (in theory, the faces and features of their own species). The final test would be to MRI the brain of a "wild child" - a human raised in the wild rather than within human civilization. It could be that "all humans look alike" to the wild child but each ape or monkey looks distinct, because that child developed a better sense of monkey faces in that area rather than human faces. While we know that area in the mind of a civilized human ends up dedicated to processing human faces, there are still questions about how it forms in babies.
There's also a map in the brain that specifically deals with the parts of the body, called the cortical homunculus. It plays a big part in your somatosensory system (your overall sense of touch) and proprioception (your sense of what your body is doing - sitting, standing, walking, etc.). Again, the signals being sent to your brain at birth are a jumble of noise. These areas build in strength as patterns are found, and unused connections die off during very early childhood. The homunculus has been implicated as the primary reason for phantom limb sensation: neurons that were once "assigned" to process sensory data from that appendage may still be activated. In some cases, MRIs have shown that an area previously assigned to an amputated limb was gradually reassigned to the body part closest to it on the homunculus "body chart". Amputating the limb creates something like the noise effect I've been discussing, except inverted - a silencing effect. But some of those neurons never completely give up. They keep holding on, as if assuming the amputated limb might return. They have memory, and they keep that memory while adding new memories on top. The activation of those neurons is still governed by the memory of the neighboring neurons, unlike the neurons of a newborn, which are most likely lighting up all over the place while they get figured out and settled in.
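Here's a toy sketch of that territory reassignment - one "cortical unit" in the hand's old area, a Hebbian learning rule, and competition for limited synaptic resources. It's entirely my own illustration, not a real cortical model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Three input channels: arm, hand, face. Before amputation, this unit
# (sitting in the hand's territory) is strongly tuned to the hand.
weights = np.array([0.05, 0.90, 0.05])  # arm, hand, face
rate = 0.02

for _ in range(500):
    # After amputation the hand channel is silent; its neighbors still fire.
    x = np.array([rng.random(), 0.0, rng.random()])
    y = weights @ x                  # the unit's response
    weights += rate * y * x          # Hebbian strengthening of active inputs
    weights /= weights.sum()         # competition for limited synaptic resources

print(weights.round(2))  # the hand weight has faded; arm and face took over
```

The "never completely give up" part shows up here too: the hand weight only shrinks when its neighbors win resources, so it decays slowly rather than being erased outright.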
Because baby MRI testing is so difficult, it's hard to say what sensory noise looks like inside a baby's brain. A study done on kittens showed that when only vertical line stimuli were presented to the kittens (they had to wear little binocular-like gadgets on their heads for a while), they never learned what horizontal lines looked like and couldn't tell things like where a table ends or where steps go up and down. The brain was not pre-programmed to process horizontal and vertical - or if it was, the areas for processing horizontal were never stimulated enough. The kittens had to learn about horizontal lines separately after the gadgets were removed and their eyes became exposed to them (along with some sad bumps and bruises while they ran into things and fell off of things more than normal - poor kitties). There are tons of funny home videos of kittens misjudging distances and falling down rather than reaching the ledge they're jumping toward. It's a learning curve. (And from experience living with lots of cats, not every cat learns it permanently.)
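The kitten result maps nicely onto orientation-selective filters. A toy sketch (assuming numpy and scipy; the kernels are made up): a "brain" that only ever built a vertical-line detector gets essentially no signal from a horizontal table edge.

```python
import numpy as np
from scipy.signal import convolve2d

# A toy "retina": a 20x20 image containing one horizontal bar (a table edge).
img = np.zeros((20, 20))
img[12, :] = 1.0

# Orientation-selective "neurons" as convolution kernels, loosely like
# the edge detectors in early visual cortex.
vertical_detector = np.array([[-1.0, 2.0, -1.0]] * 3)  # fires on vertical bars
horizontal_detector = vertical_detector.T              # fires on horizontal bars

v = np.abs(convolve2d(img, vertical_detector, mode="same")).sum()
h = np.abs(convolve2d(img, horizontal_detector, mode="same")).sum()
print(f"vertical detector response:   {v:.0f}")  # small (just border effects)
print(f"horizontal detector response: {h:.0f}")  # large: the edge registers
# A kitten raised on vertical stripes is like a visual system that only
# ever built the first detector: the table edge produces almost no response.
```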
So, we have a great understanding that over time the brain recognizes patterns (it is, as I've said many times before, nothing more than a pattern recognition program), but we don't have a very clear understanding of what noise looks like before those patterns get recognized. I can only imagine that a whole heck of a lot of neurons are firing off in all sorts of random directions while trying to figure things out. No wonder babies cry when they are first born. That's quite a lot of "loud" sensory noise they are receiving.
"Why do my eyes hurt?"
"Because you've never used them before."
There was one video of a cochlear implant recipient who spoke about how, while she was walking, the sound changed because she was walking on leaves. She had to ask her husband or boyfriend (I can't recall which now) what the sound she was hearing was. In that moment she wasn't even aware that it coincided with her footsteps (and his steps next to hers). She had to make a new neural connection: the change in sound corresponded with her footsteps and indicated a change in ground surface - something hearing folks take for granted and blind folks rely on quite heavily when determining location. Locating sounds is also nigh impossible with only a single implant. Two implants, one in each ear, are what allow for stereophonic hearing. Even once a recipient has an implant, they still have no clue where a sound may be emanating from until they learn some level of sound localization. Trying to explain the Doppler effect once the person can hear is still a bit challenging.
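The two-implants point is about interaural time differences: a sound off to one side reaches the near ear a fraction of a millisecond before the far one, and that tiny delay encodes the angle. A rough sketch of the geometry (the constants are approximate):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
EAR_SPACING = 0.18      # m - a rough head width

def direction_from_itd(delay_s):
    """Angle of a sound source from the interaural time difference:
    0 degrees = straight ahead, +90 = fully to one side."""
    # The extra path to the far ear is roughly EAR_SPACING * sin(angle).
    s = np.clip(delay_s * SPEED_OF_SOUND / EAR_SPACING, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

# A sound reaching the right ear 0.3 ms before the left:
print(f"about {direction_from_itd(0.3e-3):.0f} degrees to the right")  # ~35
```

With a single implant there's no delay to compare, which is why direction is so hard to recover at first.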
These thoughts brought to you by my exposure to xkcd.