Importance of vision

Vision is our dominant sense

problem to solve in ARO: visual closure

Posted in Cool stuff, Not to publish | Comments Off

Error codes in cuda

Error codes can be found in the driver_types.h file.
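A quick way to browse the cudaError_t values is to grep the header directly. The install path below is an assumption (a standard Linux toolkit layout); adjust CUDA_HOME to your setup.

```shell
# Assumes a standard install under /usr/local/cuda; adjust CUDA_HOME as needed
CUDA_HOME=/usr/local/cuda
grep -n "cudaError" "$CUDA_HOME/include/driver_types.h" | head -n 20
```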

Posted in programming-general | Comments Off

Torch issues


GPU ClassNLLCriterion.lua:34: bad argument #1 (field weights does not exist)

1. Update torch.
2. Link cuda to cuda-7.5.
3. Run torch's ./ script again.
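The three steps above can be sketched as shell commands. The ~/torch checkout location, the symlink path, and the update.sh/install.sh script names are assumptions based on the common torch/distro layout.

```shell
# 1. update torch (assumes a torch/distro checkout in ~/torch)
cd ~/torch && git pull && ./update.sh

# 2. point the cuda symlink at cuda-7.5
sudo rm -f /usr/local/cuda
sudo ln -s /usr/local/cuda-7.5 /usr/local/cuda

# 3. rerun torch's install script
cd ~/torch && ./install.sh
```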

Posted in torch | Comments Off

How Deaf People Think


Today I found out how deaf people think in terms of their “inner voice”.  It turns out, this varies somewhat from deaf person to deaf person, depending on their level of deafness and vocal training.

Those who were born completely deaf and only learned sign language will, not surprisingly, think in sign language.  What is surprising is that those who were born completely deaf but learn to speak through vocal training will occasionally think not only in the particular sign language that they know, but sometimes also in the vocal language they learned, with their brains coming up with how the vocal language sounds.  Primarily though, most completely deaf people think in sign language.  Similar to how an “inner voice” of a hearing person is experienced in one’s own voice, a completely deaf person sees or, more aptly, feels themselves signing in their head as they “talk” in their heads.

For those deaf people who are not completely deaf or wear devices to allow them to hear somewhat, they will often experience more vocal language in their “inner voice” in proportion to how much they can hear.

Interestingly, deafness is significantly more serious than blindness in terms of the effect it can have on the brain.  This isn’t because deaf people’s brains are different from hearing people’s in terms of mental capacity or the like;  rather, it is because of how integral language is to how our brain functions.   To be clear, “language” here not only refers to spoken languages, but also to sign language.  It is simply important that the brain have some form of language it can fully comprehend and can turn into an inner voice to drive thought.

Recent research has shown that language is integral in such brain functions as memory, abstract thinking, and, fascinatingly, self-awareness.  Language has been shown to literally be the “device driver”, so to speak, that drives much of the brain’s core “hardware”.  Thus, deaf people who aren’t identified as such very young, or who live in places where they aren’t able to be taught sign language, will be significantly handicapped mentally until they learn a structured language, even though there is nothing actually wrong with their brains.  The problem is even more severe than it may appear at first because of how important language is to the early stages of development of the brain.  Those completely deaf people who are taught no sign language until later life will often have learning problems that stick with them throughout their lives, even after they have eventually learned a particular sign language.

It is because of how integral language is to how our brains develop and function that deaf people were once thought of as mentally handicapped and unteachable.   One can see how observing someone who can’t communicate due to lacking any language and who lacks much self awareness might appear this way.  However, in recent history, up until the 1970s, it was still thought that deaf people were somehow mentally handicapped.

How could this be when they had various sign languages and even vocal training to allow their brains to develop and function properly?  Well, the problem stemmed from the fact that in the 1880s it was decided that deaf people should not use sign language; rather, they should be forced to use spoken language almost exclusively.  This seems reasonable enough on the surface as deaf people are fully capable of learning spoken language and this would allow them to more completely integrate into the hearing world.  The problem with this was only recently discovered and indeed many of the negative implications are only just now being understood.

It turns out, completely deaf people who are forced to use only spoken language are only slightly better off than those who know no language, in terms of their brain functions.  Recent research has shown the brains of the completely deaf never fully associate spoken language in the way sign language gets ingrained in their brains as a language; principally they never develop an “inner voice”, which is necessary for our brains to process information.

They do gain significantly more sense of self and better memory and the like over those who have no language, but in this state, they will never fully reach their brain’s potential as they would if they learned sign language.  “There is still a lot of debate over what are the minimal levels of exposure needed to stimulate the language centers. But it is clear that deaf children need early experience of some sort of language if they are going to be good communicators in later life,” says Professor David Wood, a leading deaf educationalist at Nottingham University.

Because of these findings, the “oralist” method of teaching the deaf that had endured for just under 100 years is being rapidly phased out in favor of a “bilingual” education where sign language is taught as early as possible and vocal language is taught as a sort of secondary language.  “Bilingualism is still very much a hot potato. We have come in for a lot of flak and been accused of pushing deaf children into a signing ghetto. Yet the deaf had a big price to pay when the old methods failed. Not only could they not communicate, but they were left without a code to think in. We can no longer ignore what the research tells us,” says Miranda Pickersgill, chief of deaf services for Leeds Local Education Authority.

Bonus Facts:

  • On the “self awareness being tied to language” note, I found this quote from Helen Keller interesting: “Before my teacher came to me, I did not know that I am. I lived in a world that was a no-world. I cannot hope to describe adequately that unconscious, yet conscious time of nothingness. (…) Since I had no power of thought, I did not compare one mental state with another.”   Helen Keller, 1908: quoted by Daniel Dennett, 1991, Consciousness Explained. London, The Penguin Press. p. 227
  • The theory behind why those completely deaf people taught only vocal language don’t properly develop an “inner voice” is that without being able to associate sounds with the phonemes and complete words, the language is too abstract, and a brain without language already struggles with things that are abstract.  Those deaf people who learn sign language, however, have less trouble with comprehending vocal language and, as noted earlier, have the ability to have an “inner voice” that speaks.  This is thought to be because once the brain has a structured language to “run” off of, similar to a computer’s operating system, abstract concepts can be more easily grasped; thus, comprehending vocal language sounds and text becomes significantly easier.
  • It turns out, our brains treat sign language exactly as it treats spoken language, even using the exact same part of the brain to process it.  This is counterintuitive as you’d think the brain would use some part of the right hemisphere with sign language being visual.  It turns out though that it uses the same portion of the left hemisphere to process sign language as it does for vocal language in the hearing.
  • Interestingly, if you take a deaf person and make them grip something hard with their hands while asking them to memorize a list of words, this has the same disruptive effect as making a hearing person repeat some nonsense phrase such as “Bob and Bill” during memorization tasks.
  • The sign language most often used in the United States is American Sign Language (ASL).  This is very different from many other types of sign languages like British Sign Language (BSL), which is completely unintelligible to someone who uses ASL and vice versa.  One of the big differences between ASL and many other sign languages is that ASL primarily uses only hand gestures, whereas most types of sign languages, such as BSL, rely heavily on facial expressions and other physical expressions outside of hand and finger gestures. (Based on comments, it appears this factoid is out of date and this is no longer the case, with facial expressions now being an integral part of ASL.)
  • While there are many standardized sign languages used, unlike spoken languages, there are also literally thousands of simple sign “languages” used among family units.  This is particularly the case with non-deaf parents that have a deaf child.  They will often develop numerous home signs and some sort of structured system of using the signs to communicate.  These simple “languages” are typically called homesign or kitchensign.
  • Ever wondered how the deaf wake themselves up in the morning?  Well, there are many different ways, none being quite as foolproof as a blaring loud noise is among the hearing.  The most foolproof method, outside of someone just coming to wake you up, is a very strong vibrating accessory attached to a special alarm clock.  The attachment is then generally placed under the pillow or on the bed near the person.  Another common method is an alarm clock that has a bright light attached that points at the sleeper.  When the alarm goes off, it flashes brightly on and off.  Due to the fact that the majority of deaf people are very heavy sleepers, as you might expect, this method doesn’t work as well as you might think.  Yet another method is programming a house or room heater to heat the room to high temperatures around the time the person needs to get up.  This, again, isn’t the best method for heavy sleepers and can result in the other downside of sweaty blankets and sheets.
  • Contrary to popular belief, most sign languages bear little resemblance to the spoken language from the same area as the particular sign language.  In other words, in the majority of cases, the various sign languages used were not developed from spoken languages.  In fact, American Sign Language resembles Chinese in form more than it does English, in that a single gesture often represents a phrase or whole idea, rather than a single word.   Further, most sign languages were invented by the deaf and thus bear little real resemblance to spoken language in form.
  • Similar to spoken language, sign languages have accents.  Typically these manifest themselves in small variations in how one does a gesture or the like.  For instance, in British Sign Language, the gesture for “car” is holding up two fists at the 10 o’clock and 2 o’clock position and wobbling slightly as if steering a wheel.  However, in Newcastle, England, the fists are held together with fingers extended slightly, but with the same motion.  In these cases, the overall gesture is very similar within a specific sign language,  but the exact gesture varies from region to region.
  • In addition to the “region to region” accents, deaf people can typically readily identify those deaf people who began signing later in life by their “late learner” accent.
  • Interestingly, one of the marks of a “southern” American Sign Language (ASL) accent is that southern signers sign much slower than northern signers, essentially mimicking the southern drawl in sign.
  • Along with differing accents, there are numerous dialects, among various deaf communities, that technically use the same sign language.  This is largely due to the various homesigns that find their way into these groups particular dialects.   For instance, it is estimated that only about 80% of the British Sign Language (BSL) is universally understood by all users of BSL.  The other 20% or so of the language varies from region to region.
  • Deaf people typically clap by striking their hands together only when surrounded by hearing people.  Otherwise, they use the more expressive motion of raising their hands and twisting their wrists rapidly to “clap”.
  • The earliest record of sign language being used dates all the way back to the fifth century B.C., in Plato’s Cratylus, where Socrates states: “If we hadn’t a voice or a tongue, and wanted to express things to one another, wouldn’t we try to make signs by moving our hands, head, and the rest of our body, just as dumb people do at present?”
  • In 1880, an international congress of deaf teachers voted to abandon sign language and teach only oral language to the deaf.  This was in an effort to allow them to integrate into hearing society.  This oral method endured until very recently, when research demonstrated just how catastrophic it is for the proper cognitive development of deaf people.
  • The first research showing the failure of the oral method was done by Cambridge Professor Ruban Conrad in the 1970s, who tested reading ability in deaf teenagers trained in the oral method.  He discovered that while the average deaf teenager could read individual words at about an eight-year-old level, they read without understanding, particularly when it came to taking in the meaning of a full sentence.  The problem was that they had not adequately been able to develop an “inner voice” due to being restricted to oral language, which they could not hear.  Thus, without the inner voice, there was no auditory imagery to linger in short term memory while they took in the whole sentence.
  • About one person out of every one thousand born is born completely deaf.
  • It is quite common for deaf people, when they are dreaming, to not only communicate in their dreams using sign language, but also to communicate telepathically and sometimes even verbally even though they may not know how to speak verbally in the waking world.  One deaf person notes: “I think most common one would be telepathy but there have been times when I wake up signing. People who have slept over at my home have heard me talking verbally in my sleep. I recall one ex boyfriend who said that he could understand what I was saying perfectly! How ironic, I speak better in my sleep than I do while awake.”
  • The critical age for learning language is around 21 to 36 months old.  During this period, much of the cognitive infrastructure in a person’s brain is developed and it is thought, much of it is developed as a result of learning language.
  • Research has shown that deaf people are able to learn a sign language, such as ASL, significantly faster than the non-deaf learn spoken language.  One deaf orphan, who was three at the time he was placed with his new family and had no knowledge of ASL, recounts the following on this issue:  “On the way home from the airport, which was about four hours drive, my mom brought a children’s book and she was ready to teach me signs right there in the car. My first education happened in a car! We were sitting in the backseat, with my dad on my left and my mom on the right. My uncle was doing the driving. My parents showed me how to sign those pictures in the book like animals, trees, etc. My mom said by the time we got home, we had finished the whole book and I’d learn all the signs from the book. After one week, I had learned enough signs that we were signing normally as if we were together since I was born. One week was all it took…”
  • Sign languages are not often written due to the incredible complexity of trying to replicate the non-sequential nature of signing.  For instance, in spoken language, if you wanted to say “I walked home” and then say something about your walk home (perhaps, “It was nice out and I enjoyed the walk”), you’d need to add that sentence in a sequential fashion.  In sign languages such as BSL, you’d use your hand gestures, facial expressions, etc. to express things like that the walk was on a dirt road; it was nice out; and you enjoyed the walk, with it all expressed simultaneously.  This non-sequential nature of sign languages allows for faster and more detailed communication, but has the drawback of being ridiculously hard to put into print, though attempts have been made.
  • The first modern treatise on the subject of sign language was published in 1620 by Juan Pablo Bonet called: Reducción de las letras y arte para enseñar a hablar a los mudos (‘Reduction of letters and art for teaching mute people to speak’)
  • When reading articles written by people who are actively members of the deaf culture, you’ll often notice when they write, they sometimes capitalize the “D” in “Deaf” and sometimes not.  What’s going on here is that people who are referred to as “deaf” (with a little “d”) are people who are medically deaf, but not an active member of the deaf culture and may even shun it to a certain extent or be completely oblivious to its existence. In other words, they generally hang out with and associate with hearing people, even though they are medically deaf.  “Deaf” (with a big D), on the other hand, refers to the deaf culture, people who actively embrace their deafness and are members of the deaf community or culture, even to the point that sometimes people who are Deaf and have the option, may shun getting a medical implant that would allow them to hear as up to that point, they may have lived their whole life in the “Deaf world” and have no interest in being able to live in the “hearing world”.
  • On a related note, you also may occasionally read that the distinction between big “D” and little “d” in the deaf culture has to do with the level of hearing the person has, with “big D” implying that the person is totally deaf, whereas “little d” implies only partial deafness or someone who has a medical device that allows them to hear.  This, however, is a “big D vs little d” distinction that is no longer generally used and the above listed definition is drastically more popular. Although, of course, there is sometimes a correlation between the two definitions in that people who are fully deaf often gravitate towards other people who are deaf in their social lives and people who have some hearing may well gravitate more towards a “hearing” culture. In any event, one commenter below summed the “big D vs. little d” distinction quite nicely, “D/deaf… refers to being culturally Deaf (D) vs. being ‘medically’ deaf (d).”


Posted in Cool stuff, Paper Reviews, research, Uncategorized | Comments Off

Installing cuda

Installing CUDA in Ubuntu 14.04


commands for drivers
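One common route on Ubuntu 14.04 is NVIDIA's .deb repo; a sketch below. The repo package file name and the driver package version are assumptions that depend on which CUDA release you want.

```shell
# Register NVIDIA's CUDA repo (file name varies by CUDA version), then install
sudo dpkg -i cuda-repo-ubuntu1404_7.5-18_amd64.deb
sudo apt-get update
sudo apt-get install cuda            # toolkit plus a matching driver
# driver only (version number is an assumption):
# sudo apt-get install nvidia-352
```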

Posted in caffe, linux, programming-general, research | Comments Off

things to read

Posted in research | Comments Off

Useful Linux Commands

Posted in linux | Comments Off


CVPR Tutorial


White Paper

Posted in programming-python | Comments Off

things to consider (caffe)

Check GPU status:  >> nvidia-smi

Pascal data

Load data as database
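For loading data as a database, caffe ships a convert_imageset tool that packs a list of labeled images into an LMDB. A sketch; the image root, list file, and output name below are assumptions.

```shell
# train_list.txt holds lines of "relative/path.jpg label"
./build/tools/convert_imageset --shuffle --resize_height=256 --resize_width=256 \
    /path/to/images/ train_list.txt train_lmdb
```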


Creating a layer


Posted in caffe | Comments Off

different version of python in a .so file

After having many problems compiling lpo, I finally got a .so file. The problem was that it crashed when I used the Anaconda version of python, but it worked properly with the OS X native version of python. Looking at what it was calling, I found:

>> otool -L     
     /Users/gonzalovaca1/caffe/caffe/selectiveSearch/lpo-release/build/lib/python/ (compatibility version 0.0.0, current version 0.0.0)
     /usr/local/lib/libboost_python-mt.dylib (compatibility version 0.0.0, current version 0.0.0)
     /System/Library/Frameworks/Python.framework/Versions/2.7/Python (compatibility version 2.7.0, current version 2.7.6)
     /usr/local/lib/libjpeg.8.dylib (compatibility version 13.0.0, current version 13.0.0)
     /usr/local/lib/libpng16.16.dylib (compatibility version 34.0.0, current version 34.0.0)
     /usr/lib/libz.1.dylib (compatibility version 1.0.0, current version 1.2.5)
     /usr/local/lib/gcc/4.9/libstdc++.6.dylib (compatibility version 7.0.0, current version 7.20.0)
     /usr/local/lib/gcc/4.9/libgomp.1.dylib (compatibility version 2.0.0, current version 2.0.0)
     /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1213.0.0)
     /usr/local/lib/gcc/4.9/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1

One of these lines pointed to a different version of Python (the Anaconda libpython2.7.dylib). To change it, I used:

install_name_tool -change /Users/gonzalovaca1/anaconda/lib/libpython2.7.dylib libpython2.7.dylib

Now, the libpython is chosen according to the order of the directories in $DYLD_FALLBACK_LIBRARY_PATH, which in my case points to the Anaconda one first.
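The ordering in question can be set like this; the exact directories are assumptions based on my Anaconda install location.

```shell
# Anaconda's lib dir first so its libpython2.7.dylib is found before the system one
export DYLD_FALLBACK_LIBRARY_PATH="$HOME/anaconda/lib:/usr/local/lib:/usr/lib"
```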

Even with that it was not working. The problem also came from the libboost_python line:

/usr/local/lib/libboost_python-mt.dylib (compatibility version 0.0.0, current version 0.0.0)

Checking what libboost_python-mt was calling, I found:

otool -L /usr/local/lib/libboost_python-mt.dylib
    /usr/local/lib/libboost_python-mt.dylib (compatibility version 0.0.0, current version 0.0.0)
    /System/Library/Frameworks/Python.framework/Versions/2.7/Python (compatibility version 2.7.0, current version 2.7.6)
    /usr/lib/libstdc++.6.dylib (compatibility version 7.0.0, current version 104.1.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1213.0.0)

which was also linked against the system Python version. Again using install_name_tool:

sudo install_name_tool -change /System/Library/Frameworks/Python.framework/Versions/2.7/Python libpython2.7.dylib  /usr/local/lib/libboost_python-mt.dylib

did the trick. Now it works with my Anaconda version, and also with the native python version when I set $DYLD_FALLBACK_LIBRARY_PATH in the right order.






Posted in programming-python | Comments Off