Sunday, July 30, 2017

Article: If You're Fluent In Emoji, Does That Technically Make You Bilingual?

The article for this week is about an interview with Vyvyan Evans, author of the book “The Emoji Code: The Linguistics Behind Smiley Faces and Scaredy Cats”. In the article the author discusses the concept of using Emoji icons as a second language when typing, or as Evans puts it, “code switching — a.k.a. flip-flopping between two languages in the course of a single sentence or thought” (Kiefer, 2017). Emoji icons have made the leap from traditional text messaging to another form of visual language, easily understood regardless of native language. They form a pictorial code that symbolizes many concepts in text-based discussions. There are many cultural variants for each Emoji, yet many are understood quite uniformly: “it doesn’t really matter what your native tongue is. English, French, Japanese: A smiley means the same thing in most languages” (Kiefer, 2017). What I find interesting about this article is that it points out something that can be used with English Language Learners to help bridge the communication gap. I hadn’t given much thought to it before, but this seems like a natural way to quickly gain understanding. Most users of handheld technology will instantly be able to communicate with the pictorial augmentation of Emoji icons.

Kiefer, E. (2017, July 24). If You're Fluent In Emoji, Does That Technically Make You Bilingual? Retrieved July 30, 2017, from http://www.refinery29.com/2017/07/164639/history-of-emojis-vyvan-evans-book

Sunday, July 23, 2017

Article: ELL Specialist: These are “My Tech Essentials”

The article for this week discusses, from an administrator’s perspective, how ELL specialists view the necessary technology tools in their schools’ classrooms. This administrator has a unique view of technology in the classroom because before becoming an administrator, Mr. Maurizio was an ELL teacher. Three technology tools he mentions are iPads, laptops, and Lightspeed Redcat audio systems.

iPads: Serve as translation tools through Google Translate, and the keyboard can be changed to show the English letter corresponding to each native character to aid in language acquisition.

Laptops: Each teacher is equipped with one for lesson planning, presentations, and visuals. Images are key to making connections in students’ language learning.

Lightspeed Redcat audio systems: Every classroom in the school district has this system. It amplifies the teacher’s voice so that students can hear the teacher clearly.

The reason I chose to discuss this article is that I had never heard of Lightspeed Redcat. I did some additional research and found that it is a system in which the teacher wears a microphone (on a lanyard around the neck) and teaches class as they normally would. The microphone broadcasts the teacher’s voice to a speaker panel, which amplifies it. What I found interesting is not the teacher’s side of speaking into a microphone, but the system’s additional features, which include station microphones at the students’ desks that allow students to be heard by their classmates as well. I have always found that I need to repeat not what I say but, typically, what students say when responding to a question, because the other students cannot hear them. This system might be a solution, and it does not seem too invasive to the learning process.

Maurizio, T. (2017, July 19). ELL Specialist: These are “My Tech Essentials”. Retrieved July 23, 2017, from https://www.eschoolnews.com/2017/07/21/curriculum-tech-essentials/

Monday, July 17, 2017

Article: K-12: Sight-Words are Hoax Words

The article I have chosen this week is about sight words. I had heard the term “sight words” before, but I have had little contact with them in my teaching; they were most prevalent when my children were in elementary school. From the article I learned that sight words are words students memorize as whole units rather than by reading the letters within them. The author argues that using sight words holds back literacy mastery and creates ongoing problems with literacy learning.

The author’s opinion is that sight words, even blended with other literacy strategies like phonics, still fall short. Reading has a lot to do with the relationships words have with the other words around them. The author equates this kind of literacy learning to a person trying to read a foreign language: they can read a few words within a sentence, but the overall context and meaning of the statement is lost. The reader understands a few words but not what the words are saying together.

I found the article interesting because it made me think about other languages that do not use letters. Chinese, for example, uses symbols for words, which is similar in concept to sight words. I am curious how difficult it is for a Chinese student to learn to read a letter-based language, and whether an ESL student in this situation would thrive or fall behind in learning English.

Price, B. (2017, July 16). K-12: Sight-Words are Hoax Words. Retrieved July 17, 2017, from http://canadafreepress.com/article/k-12-sight-words-are-hoax-words

Sunday, July 2, 2017

Article: This Startup Wants Translate Your Own Voice into Another Language that You Can’t Speak

The article I selected this week is interesting because I believe technology is going to begin to break down communication barriers. BabelOn, a startup company in California, is creating software that will translate English into another language. Other software and translators are out there that provide similar services; what is different about BabelOn is its ability to recreate the statements in the speaker’s own voice. The intent is to personalize conversations over Skype, movie dialogue, and other digital communications as they are translated. Other companies currently offer translation, but in a computer voice. The user starts by creating a scripted recording, which can be hours long, to set a baseline of sounds. The baseline is used to model the user’s voice for all the sounds the translator uses. The company’s software can translate eight languages.

Currently, the software needs to receive a script and translate it, which can take hours. Eventually, with hardware advancements, the goal is to get the software to recognize the spoken word and translate it in real time. There are also security concerns regarding voice recognition: the company is working on safeguards and protocols to prevent users’ voices from being compromised.

I find this interesting for education, where a digitally recorded presentation could now be translated while still harnessing the authentic voice of the instructor. Hearing the actual voice of the person doing the teaching makes for a more authentic and engaging learning experience.

Bavor, S. (2017, July 02). This Startup Wants Translate Your Own Voice into Another Language that You Can’t Speak. Retrieved July 02, 2017, from http://trendintech.com/2017/07/02/this-startup-wants-translate-your-own-voice-into-another-language-that-you-cant-speak/

Sunday, June 25, 2017

Article: The New Digital Divide

The chosen article this week has to do with digital literacy and teachers. Over the past several years the focus was on infrastructure and access to technology in schools. Now schools are typically well equipped, with close to a 99% connectivity rate for public schools having an internet connection and significant funding increases to grow and maintain digital access. However, according to the article, there is still an issue districts need to overcome: a measurable skills gap in digital literacy among teachers that needs to be addressed.

The deficiency in digital literacy is not necessarily due to a lack of teacher drive to learn and assimilate technology, but to the lack of a standardized digital-literacy curriculum and training for students and teachers. Teachers typically train through a mixed variety of resources and tools without guidance on quality and grade appropriateness, resulting in unequal instruction from classroom to classroom. What districts and faculty are now looking to do is adopt or create standardized curriculum and training so that students and teachers are all brought up to a basic level of digital literacy that enables success.

Oelrich, K. (2017, June 23). The New Digital Divide. Retrieved June 25, 2017, from https://www.languagemagazine.com/2017/06/new-digital-divide/

Sunday, June 18, 2017

Article: How Your Brain Understands Visual Language

The article I selected this week is a throwback to my design classroom. It is about how people interpret and make associations with things they are familiar with when objects, symbols, or sounds are used to reinforce or clarify what is being communicated to a viewer. Through associations formed by life experience, people can interpret information based on the prior knowledge they have about a subject, and that prior knowledge can be used to help create understanding. The author discusses how we have communication centered on literacy and numeracy, but communication with graphics, referred to as graphicacy, is not as prevalently recognized. In the author’s view, “understanding and conception of the visual—becomes of equal importance with that of literacy and numeracy in today’s learning process” (Kahane, 2017). Graphics can carry meaning beyond the literal word representing an idea. Icons like the handset of a rotary-dial phone are still used today and are understood because of their historical and cultural origins; they tend to transcend time. Even though phones no longer look like the symbol, it is still recognized and understood.

Kahane, J. (2017, April 28). How Your Brain Understands Visual Language. Retrieved June 18, 2017, from https://www.fastcodesign.com/3047340/how-your-brain-understands-visual-language

Sunday, June 11, 2017

Article: Augmenting Ability: Microsoft Using AI, Smart Glass Tech to Aid Differently-Abled

My alert for this week is not so much about English as a Second Language but about something I think will become a big deal in the future: artificial intelligence and communication. The article I read discusses a collaboration among Microsoft, the Chinese Academy of Sciences, and Beijing Union University, who have created an augmented reality prototype that translates spoken word to sign language and sign language to spoken word through a smartphone or smart glass app. The article cites Microsoft’s continued focus on breaking down barriers with assistive technology, like what Microsoft integrated into its OneNote software for dictation and text-to-speech. It will be very interesting to see how artificial intelligence could begin to assist in ways never thought possible before.

Sunday, June 4, 2017

Article: Why the 10th Birthday Is Critical for Predicting the Career Paths of Immigrant Students

An article that came through my alerts, and which I found interesting, has to do with the age at which ESL children determine their career pathways. Students who immigrate to the United States at or after the age of 10 (which is my son’s current age) tend to focus on math, engineering, and physical-science-based careers. This seems to be due to the development of language skills in children, which peaks before age 10. After age 10, students have less interest in developing new language for communicating with others, concentrating instead on learning language skills for the purpose of learning information. It is interesting to find research like this, which points to the root causes of how students perceive information and reading, and how that determines what they choose to focus on in their learning.