WO2005022487A2 - System and method for language instruction - Google Patents
System and method for language instruction
- Publication number
- WO2005022487A2 (PCT/US2004/028216)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- description
- event
- phrase
- text
- perspective
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/06—Foreign languages
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
- G09B5/065—Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B7/00—Electrically-operated teaching apparatus or devices working with questions and answers
Definitions
- This invention generally relates to foreign language instruction, and more particularly to systems and methods for teaching foreign language by using immersion techniques.
- Immersion techniques have been developed to improve the teaching and learning of foreign languages. Simply stated, immersion techniques involve "immersing" students in the foreign language, thereby attempting to replicate the process by which students learned their native language, but in a systematic way. Thus, students learn a language by listening to speakers of the language associate words with objects, concepts and actions.
- Immersion techniques recognize that most people learned their native language by observing those speaking the language, and that the learning of grammatical rules and the like came much later than fluency in the language. Thus, in language immersion techniques, memorization and the learning of grammatical rules are typically de-emphasized.
- Computer software programs have been developed which employ, at least to some extent, language immersion techniques. These programs can be utilized with conventional personal computers having both a visual display (such as a computer screen) and an audio component (such as a speaker). These programs conventionally employ some degree of multi-media presentation, with at least some interactivity with the user. For example, such programs may provide a still picture or video component in which an object is displayed or video footage of a certain event is presented on the computer screen. In conjunction with the visual display, the foreign language may be audibly presented. For example, when a picture of a dog is presented on the screen, the audio component of a German language instruction program may present the user with a recorded native speaker saying "der Hund.” In addition, the "script" of the recorded audio message may be presented on the screen. Thus, in a text box on the computer screen, the words "der Hund" may be presented.
- A translation of the foreign word(s) into the student's native language is often presented on the screen.
- For example, the English words "the dog" may be provided in another text box on the computer screen when the German words "der Hund" are presented.
- Many language instruction programs provide some interactivity with the user to assist the user in learning the foreign language.
- One common technique is to present questions about the depicted object or event and the corresponding foreign language description.
- For example, a video component may be presented showing an interaction involving asking for directions to the airport.
- A question may then be presented to the user to assess the user's understanding of the foreign language - for example, "Is the airport on the left or right?" The user may be rewarded with points for correctly answering the question.
- A method of language instruction is provided which comprises presenting a first description of an event responsive to a first perspective, and presenting a second description of the event responsive to a second perspective, wherein the first description of the event and the second description of the event are in a common language.
- The first and second descriptions of the event can be provided in a variety of formats. For example, they can be provided in audio format such that an audio recording of the first description is played and is followed by an audio recording of the second description.
- Alternatively, the first and second descriptions could be displayed as text, for example, on a computer screen.
- A video component could also be provided that displays a video presentation of the event.
- The present invention provides a description presentation apparatus which may comprise a visual event presentation area, a first text presentation area, and a second text presentation area.
- The visual presentation apparatus may display a visual representation of an event in the visual event presentation area, a first text description of the event in the first text presentation area, and a second text description of the event in the second text presentation area.
- The first text description of the event is responsive to a first perspective of the event, and the second text description of the event is responsive to a second perspective of the event. Both the first text description of the event and the second text description of the event are in a common language. In other embodiments, additional text descriptions may be provided.
- The perspective from which the description of the event is provided may be the perspective of one of the participants of the event depicted in the visual event presentation area.
- Other perspectives may include a narration perspective or a dialogue perspective.
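The method and apparatus summarized above reduce to a small data model: one event, several descriptions of it, each tied to a perspective, all in a single common language. The sketch below is illustrative only; the class and field names are assumptions, not part of the claimed system.

```python
from dataclasses import dataclass, field

@dataclass
class Description:
    """One description of the event, told from a single perspective."""
    perspective: str       # e.g. "participant_1", "narrator", "dialogue"
    paragraphs: list[str]  # the description text, paragraph by paragraph

@dataclass
class Event:
    """An event plus its descriptions, all in one common (target) language."""
    language: str
    descriptions: list[Description] = field(default_factory=list)

    def description_for(self, perspective: str) -> Description:
        """Return the description told from the requested perspective."""
        return next(d for d in self.descriptions if d.perspective == perspective)

# Sample data adapted from the airport example in the specification.
event = Event(
    language="en",
    descriptions=[
        Description("participant_1", ["Ah, at last, we'd made it back home!"]),
        Description("narrator", ["Kate and Jim are back from their honeymoon."]),
    ],
)
print(event.description_for("narrator").paragraphs[0])
```

Additional perspectives (for example, a dialogue perspective) would simply be further `Description` entries on the same event.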
- FIG. 1 is a representation of a display apparatus according to an embodiment of the present invention.
- FIG. 2 is a representation of another display apparatus according to an embodiment of the present invention.
- FIG. 3 is a representation of another display apparatus according to an embodiment of the present invention, in which a glossary feature is shown.
- FIG. 4 is a block diagram of a computer system suitable for use with an embodiment of the present invention.
- An embodiment of the present invention overcomes this problem by presenting descriptions of the event to the user from at least two different perspectives.
- If the user does not understand a particular word in the first description, he or she can go on to the second description to see if he or she can understand that description.
- The second description may contain a synonym of that word which the user does understand.
- Alternatively, the second description may allow the user to understand the first description by providing some context for it. In this way, the user will come to rely on the second description (which is also in the foreign language) rather than the native language translation of the first passage.
- Thus, the availability of the second description increases the likelihood that the user will remain in the immersion context, rather than resort to translation.
- The use of descriptions from more than one perspective also serves to deepen the user's overall understanding of the language. For example, providing descriptions from different perspectives allows the user to encounter a variety of synonyms for a particular object or idea. It also allows the user to encounter a variety of syntaxes and sentence structures, so that the user can learn the various ways that speakers of a language will attempt to get across the same or similar ideas.
- The user may also encounter the same ideas expressed in different voices, for example, passive voice and active voice.
- The user may also see the different ways in which pronouns are used, the differences between direct and indirect discourse, or the use of different tenses.
- FIG. 1 is a representation of a description presentation apparatus including a display apparatus 101 according to an embodiment of the present invention.
- Display apparatus 101 can be presented to a user who wishes to learn a language using a language immersion technique.
- In an embodiment, the display apparatus 101 is a conventional computer screen such that video footage of an event may be displayed to the user.
- However, the current invention is not limited to the computer or "multi-media" context.
- For example, display apparatus 101 may simply be a piece of paper, such that the items described below may be presented in a book.
- Video could not be presented in such an embodiment, but still pictures of events could be. Either case is within the scope of this invention, as is any other medium that permits the displays contemplated by the present invention, including, but not limited to, cellular phones, personal digital assistants ("PDAs"), televisions, game consoles and the like.
- Display apparatus 101 has a number of areas in which different items are displayed.
- One such area is visual event presentation area 102.
- In visual event presentation area 102, a visual representation of an event is presented.
- "Event" is intended broadly to mean any action or series of actions, or any object, objects or the like that can be non-linguistically presented to a user.
- An event can be a series of interactions between two individuals, such as one individual asking another for directions to the airport.
- An "individual" can include any language-using entity, including, for example, human beings as well as fanciful characters such as cartoon characters and the like.
- An event can also include a still picture of any object, such as an animal, one or more individuals, or an inanimate object.
- Visual event presentation area 102 contains a visual representation of such an event, such as motion video footage of two individuals interacting, or a still image of an object.
- The user is provided with various controls over the video footage. For example, the user is able to rewind the video footage by clicking on button 103 in a manner apparent to one of ordinary skill in the art in view of the specification. Similarly, the user is able to play the video footage by clicking on button 104, pause the video footage by clicking on button 105, and fast-forward the video footage by clicking on button 106. Additionally, the user can go to a specific portion of the video footage by clicking on button 107 and dragging it to a specific point in the footage in a manner well-known in the art. Timer 108 may be used to track specific points in time during the video footage.
- A specific example of the video footage that may be presented in visual event presentation area 102 is footage of two individuals exiting an airplane. These individuals may then appear to engage each other in conversation, in which one shows the other his watch. The individuals then walk to the baggage claim area of the airport, pick up their bags, and sit down.
- The display apparatus 101 has a first text presentation area 109 and a second text presentation area 110.
- Displayed in the first text presentation area 109 is a first text description 111 of the event shown in the visual event presentation area, which is told from a particular perspective. If the event presented was the event described above regarding the two individuals exiting an airplane, the event could be described from the perspective of one of those individuals. Thus, the first text description 111 could be a description of the interaction from the point of view of one of the participants of the event. For example, the first text description 111 presented in the first text presentation area 109 could read:
- The second text presentation area 110 presents a second text description 112 of the event, told from a different perspective than the first text description 111.
- For example, the second text description 112 could be told from the perspective of a narrator who is not depicted in the visual event presentation area, and could read:
- First text presentation area 109 and second text presentation area 110 are located adjacent to one another.
- The text presented in these areas lines up, at least on a paragraph level.
- That is, the first paragraph of first text description 111 and the first paragraph of second text description 112 begin substantially on the same line; similarly, the second paragraph of each text description begins on substantially the same line.
- Moreover, the first paragraphs of each text description contain descriptions of essentially the same concepts.
- For example, the first sentence of first text description 111 reads, "Ah, at last, we'd made it back home!"
- The first sentence of second text description 112 in this example reads, "Kate and Jim are back from their honeymoon."
- However, the text descriptions of embodiments of the present invention need not have corresponding meanings on a sentence-by-sentence basis.
- For example, the second sentences of these examples read, respectively, "But where was our ride?" (from first text description 111) and "But they've arrived at the airport three hours late." (from second text description 112).
- These sentences do not express essentially the same meaning in the way that the first sentences of the example do.
- Although the text descriptions need not have corresponding meanings on a sentence-by-sentence basis, in embodiments they convey essentially corresponding ideas on a paragraph-by-paragraph basis.
- That is, a paragraph as a whole in one text description conveys essentially the same meaning as the corresponding paragraph in a second text description.
- In this example, the first paragraphs in each text description convey the ideas that the individuals had arrived late from a trip and that they could not find the person who was supposed to pick them up.
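The paragraph-level correspondence just described can be represented by giving every description the same number of paragraphs and pairing them by index. A minimal sketch, with sample text adapted from the airport example (the exact wording here is illustrative):

```python
# Two descriptions of the same event; paragraph i of each covers essentially
# the same portion of the event, even though individual sentences need not match.
first_person = [
    "Ah, at last, we'd made it back home! But where was our ride?",
    "At least our bags were there when we reached the baggage claim.",
]
narrator = [
    "Kate and Jim are back from their honeymoon. But they've arrived at the airport three hours late.",
    "They walk to the baggage claim area, pick up their bags, and sit down.",
]

def aligned_pairs(a, b):
    """Pair corresponding paragraphs; alignment assumes equal paragraph counts."""
    if len(a) != len(b):
        raise ValueError("aligned descriptions need the same number of paragraphs")
    return list(zip(a, b))

for p1, p2 in aligned_pairs(first_person, narrator):
    print(p1, "<->", p2)
```

Keeping the counts equal is what lets the display keep the two text areas aligned paragraph by paragraph.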
- An audio component could also be provided, such as by a recorded voice that is played over a conventional audio speaker.
- For example, the voice of an individual reading the text presented in first text presentation area 109 could be provided in conjunction with the video presented in visual event presentation area 102.
- Alternatively, the text provided in the second text presentation area 110, rather than the first text presentation area 109, could be provided in audio form.
- For example, an audio presentation of the first paragraph of first text description 111 could occur when the video shows the two individuals exiting the airplane and engaging each other in conversation. The video and audio could then be synchronized such that the audio presentation of the second paragraph occurs when the video shows the two individuals moving to the baggage claim area and retrieving their bags.
- In this way, the audio and video are synchronized such that the aspect of the event being described by the audio presentation corresponds with the aspect of the event shown in the video. Additionally, the sentence of the first text description 111 that is being presented by the audio component is highlighted (see highlighting 113) in order to assist the user in following along with the audio.
- When audio is presented that is associated with a text description told from the point of view of a particular participant in the event, the audio is voiced by a person of the same gender as the participant. This allows the user to hear words in the foreign language in diverse voices, thereby improving his or her understanding of the pronunciation of the foreign language.
- As the audio plays, the text will automatically "scroll" such that the visible text corresponds to the audio portion being played.
- The text in the text presentation area that is not being presented in audio form scrolls in a similar manner, so that the two text presentation areas maintain their alignment, at least on a paragraph-to-paragraph level.
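One way to implement the highlighting and scrolling just described is to store the audio start time of each sentence and look up the active sentence index at the current playback time. A sketch, with invented timing values:

```python
import bisect

# Invented timing track: start time, in seconds, of each sentence in the
# audio recording of the active text description.
sentence_starts = [0.0, 2.5, 6.0, 9.8]

def sentence_to_highlight(t):
    """Index of the sentence being spoken at playback time t (seconds)."""
    # bisect_right finds how many start times are <= t; the active sentence
    # is the last one that has already started.
    return max(bisect.bisect_right(sentence_starts, t) - 1, 0)

# As playback time advances, the highlighted sentence advances with it,
# and both text areas can scroll to keep that sentence visible.
print(sentence_to_highlight(3.0))
```

The same lookup, applied to paragraph start times, drives the paragraph-level scrolling of the text area that is not being read aloud.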
- A button 119 may also be provided. By clicking on button 119, the user can scroll through the text to get to a specific portion of the text, and the corresponding video and/or audio. Additionally, a user can move to a particular point in the text (with corresponding video and/or audio) by simply clicking on the desired portion of the text within the text presentation area.
- First text presentation area 109 has a number of buttons 114-117. By clicking on one of these buttons, the user can display the text associated with a particular perspective in first text presentation area 109.
- For example, by clicking on button 115, the user will display a different text description in first text presentation area 109. In this example, the user will see a text description from the perspective of the other individual participating in the event. Thus, the following text description could appear in first text presentation area 109:
- This text description provides a description of the event shown in the video, and corresponds, at least on a paragraph-by-paragraph basis, to the general concepts in the first and second text descriptions 111 and 112.
- Switching the text description in the first text presentation area 109 will switch the audio recording to correspond to the new text description, and the audio recording picks up the description at a point that roughly corresponds to where the previous audio portion left off.
- Thus, the audio track will begin on the second paragraph of the new text description (i.e., "At least our bags were there . . .").
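Because the descriptions align paragraph by paragraph, resuming after a perspective switch reduces to finding the current paragraph index in the old audio track and starting the new track at the same index. A sketch, with invented paragraph timings:

```python
import bisect

def current_paragraph(t, paragraph_starts):
    """Index of the paragraph playing at time t in the active audio track."""
    return max(bisect.bisect_right(paragraph_starts, t) - 1, 0)

old_track_starts = [0.0, 12.0]   # old description: paragraph 2 starts at 12.0 s
new_track_starts = [0.0, 10.5]   # new description: its paragraph 2 starts at 10.5 s

# The user switches perspectives 15 s in, i.e. during the second paragraph,
# so the new recording picks up at its own second paragraph.
idx = current_paragraph(15.0, old_track_starts)
resume_time = new_track_starts[idx]
print(idx, resume_time)
```

The resume point is only "roughly" corresponding, as the text notes, because the two recordings differ in wording and pacing; paragraph granularity is what the alignment guarantees.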
- Dialogue can also be a type of "description" of an event.
- The dialogue perspective is essentially the direct dialogue between the participants in the event, similar to the script of a play.
- In this example, the direct dialogue text could be:
- This text description, too, provides a description of the event shown in the video, and corresponds, at least on a paragraph-by-paragraph basis, to the general concepts in the other text descriptions.
- By clicking button 117, the user will display the text description from the perspective of a third-party narrator, as described above with respect to second text description 112.
- More than two text boxes may be viewable at a given time.
- As shown in FIG. 2, a display apparatus 201 is provided that is similar to display apparatus 101.
- Display apparatus 201 contains a visual event presentation area 202 and is provided with four text presentation areas 203, 204, 205, and 206.
- A user of display apparatus 201 could thus simultaneously observe four text descriptions of the event told from four different perspectives.
- Display apparatus 101 is also provided with a question presentation area 118.
- Various questions may be presented to the user in question presentation area 118 to test the user's understanding of the event and the text and audio descriptions of the event. These questions may be presented in multiple choice, fill in the blank, true-false, or any other suitable format, which allows the user to input an answer, and the computer system to assess whether the answer is correct.
- The answer may be input by clicking on a single choice from a list of choices using a mouse, or by using a computer keyboard, as apparent to one of ordinary skill in the art in light of the specification.
- The ability to correctly answer at least some questions requires the user to understand at least two of the text descriptions.
- For example, a question may ask the user to assess the response of one of the participants to a request posed by another of the participants. This may require the user to understand the text description told from the first participant's perspective as well as the text description told from the second participant's perspective.
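Checking an answer of the kind described above only needs a stored correct choice per question. A minimal multiple-choice sketch; the question text and field names are invented for illustration and are not part of the patent text:

```python
# Hypothetical question drawn from the airport example.
question = {
    "prompt": "Why are the two travelers waiting at the airport?",
    "choices": [
        "Their flight was cancelled.",
        "Their ride has not arrived.",
        "They lost their bags.",
    ],
    "answer": 1,  # index of the correct choice
}

def check_answer(q, selected):
    """Return True if the user's selected choice index is correct."""
    return selected == q["answer"]

print(check_answer(question, 1))
```

Fill-in-the-blank or true-false formats would store the expected string or boolean instead of a choice index, with the same comparison step.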
- A dictionary or glossary feature may also be provided.
- If the user does not understand a particular word, the user may be able to click on that word, in a manner apparent to one of skill in the art in view of this specification, and receive a further explanation of the meaning of that word.
- FIG. 3 shows a display apparatus 301 similar to display apparatus 101. If the user does not understand the word "hour" from the various text descriptions available, he or she can click on that word, and text box 302 will appear providing further explanation of the word. In an embodiment, this explanation may be provided in the same language as the text description. The explanation could also contain a translation of the word into the user's native language.
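The glossary behavior in FIG. 3 can be modeled as a per-lesson dictionary mapping a clicked word to a same-language explanation, with an optional native-language translation. The entries and gloss text below are illustrative only; "hour" follows the example in the specification, and German is assumed as the user's native language:

```python
# Illustrative glossary for a lesson teaching English to a German speaker.
glossary = {
    "hour": {
        "explanation": "A period of sixty minutes.",  # same-language gloss
        "translation": "Stunde",                      # optional native-language aid
    },
}

def lookup(word):
    """Entry shown in a pop-up text box when the user clicks on a word."""
    # Normalize case and trailing punctuation so clicks on "Hour," still match.
    return glossary.get(word.lower().strip(".,!?"))

entry = lookup("Hour")
print(entry["explanation"])
```

Keeping the explanation in the target language preserves the immersion context; the translation field is the fallback the text describes.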
- As described above, an embodiment of the present invention preferably comprises display apparatus 101, which has the ability to display a video of an event, as well as at least two text-based descriptions of the event.
- An audio component provides a recording of a native speaker of the language speaking one of the text descriptions.
- Thus, this embodiment has a visual component, a text component, and an audio component.
- One or more of these components may be selectively turned off. For example, a user may wish to test his or her ability to understand simply the spoken language. Thus, he or she may turn off the text display or even the video, and attempt to understand simply what is heard from the audio component. Similarly, a user may wish to test his or her reading comprehension and turn off the audio and/or video portion, leaving the text displayed. Finally, a user may wish to test his or her ability to pronounce the foreign language and turn off the audio portion.
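The selectable components described above amount to three independent on/off flags. A sketch of how the drills in this paragraph map onto those flags; the names are assumptions made for illustration:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Components:
    """Which lesson components are currently presented to the user."""
    video: bool = True
    text: bool = True
    audio: bool = True

full = Components()
listening_drill = replace(full, video=False, text=False)  # audio only
reading_drill = replace(full, video=False, audio=False)   # text only
speaking_drill = replace(full, audio=False)               # user supplies the speech

print(listening_drill)
```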
- The descriptions of the event may also be presented in audio form only, such that the user is presented with an audio recording of a description of an event told from a first perspective.
- The user may then be presented with an audio recording of a description of the event told from a second perspective.
- This audio presentation may occur in conjunction with the display of a video representation of the event, but need not.
- A computer program product may be provided comprising a computer usable medium having computer logic recorded on the medium for instructing a computer to carry out the functions described above.
- For example, the computer can be instructed to display the video footage and the text descriptions described above, as well as provide the audio associated with one of the text descriptions.
- As shown in FIG. 4, the computer system 402 includes one or more processors, such as a processor 404.
- The processor 404 is connected to a communication bus 406.
- The computer system 402 also includes a main memory 408, preferably random access memory (RAM), and can also include a secondary memory 410.
- The secondary memory 410 can include, for example, a hard disk drive 412 and/or a removable storage drive 414, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, etc.
- The removable storage drive 414 reads from and/or writes to a removable storage unit 418 in a well-known manner.
- The removable storage unit 418 represents a floppy disk, magnetic tape, optical disk, CD-ROM, etc., which is read by and/or written to by the removable storage drive 414.
- The removable storage unit 418 includes a computer usable storage medium having stored therein computer software and/or data.
- The secondary memory 410 may include other similar means for allowing computer programs or other instructions to be loaded into the computer system 402.
- Such means can include, for example, a removable storage unit 422 and an interface 420. Examples of such can include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units 422 and interfaces 420 which allow software and data to be transferred from the removable storage unit 422 to the computer system 402.
- The computer system 402 can also include a communications interface 424.
- The communications interface 424 allows software and data to be transferred between the computer system 402 and external devices. Examples of the communications interface 424 can include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, etc.
- Software and data transferred via the communications interface 424 are in the form of signals 426 that can be electronic, electromagnetic, optical or other signals capable of being received by the communications interface 424.
- Signals 426 are provided to the communications interface 424 via a channel 428.
- The channel 428 carries signals 426 and can be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, an RF link and other communications channels.
- The term "computer-readable storage medium" is used to refer generally to media such as the removable storage unit 418, a hard disk installed in hard disk drive 412, and signals 426. These media are means for providing software and operating instructions to the computer system 402.
- Computer programs are stored in the main memory 408 and/or the secondary memory 410. Computer programs can also be received via the communications interface 424. Such computer programs, when executed, enable the computer system 402 to perform the features of the present invention as discussed herein. In particular, the computer programs, when executed, enable the processor 404 to perform the features of the present invention. Accordingly, such computer programs represent controllers of the computer system 402.
- The software may be stored in a computer-readable storage medium and loaded into the computer system 402 using the removable storage drive 414, the hard drive 412 or the communications interface 424.
- The control logic, when executed by the processor 404, causes the processor 404 to perform the functions of the invention as described herein.
- In another embodiment, the invention is implemented primarily in hardware using, for example, hardware components such as application-specific integrated circuits (ASICs). Implementation of such a hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s). In yet another embodiment, the invention is implemented using a combination of both hardware and software.
- Also coupled to communications bus 406 is a monitor 426. As described above, in embodiments of the present invention, a variety of items are displayed on a display apparatus. A conventional computer monitor, such as monitor 426, can serve as such a display apparatus. Thus, the computer program will cause the processor to display video footage, text descriptions and the like on monitor 426, where they can be viewed by the user.
- An audio device 428 is also coupled to communications bus 406.
- Audio device 428 can be any device that allows the computer system 402 to provide audible output, including, for example, conventional speakers.
- Thus, the computer program will cause the processor to play the appropriate recorded audio track over the audio device 428.
- Input device 430 is coupled to communications bus 406.
- Input device 430 may be any number of devices that allow a user to input information into the computer system, including for example, a keyboard, a mouse, a microphone (for example interacting with voice recognition software), a stylus and the like.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006524943A JP2007504496A (en) | 2003-09-02 | 2004-08-31 | Language education system and method |
EP04782650A EP1673752A4 (en) | 2003-09-02 | 2004-08-31 | System and method for language instruction |
KR1020067004359A KR101118927B1 (en) | 2003-09-02 | 2006-03-02 | System and method for language instruction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/652,620 | 2003-09-02 | ||
US10/652,620 US7524191B2 (en) | 2003-09-02 | 2003-09-02 | System and method for language instruction |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005022487A2 true WO2005022487A2 (en) | 2005-03-10 |
WO2005022487A3 WO2005022487A3 (en) | 2005-12-29 |
Family
ID=34217693
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2004/028216 WO2005022487A2 (en) | 2003-09-02 | 2004-08-31 | System and method for language instruction |
Country Status (6)
Country | Link |
---|---|
US (1) | US7524191B2 (en) |
EP (1) | EP1673752A4 (en) |
JP (2) | JP2007504496A (en) |
KR (1) | KR101118927B1 (en) |
CN (1) | CN1842831A (en) |
WO (1) | WO2005022487A2 (en) |
CN104756181B (en) * | 2012-10-31 | 2017-10-27 | 日本电气株式会社 | Playback reproducer, setting device, back method and program |
US20140272820A1 (en) * | 2013-03-15 | 2014-09-18 | Media Mouth Inc. | Language learning environment |
JP6623575B2 (en) * | 2015-06-24 | 2019-12-25 | カシオ計算機株式会社 | Learning support device and program |
US11741845B2 (en) * | 2018-04-06 | 2023-08-29 | David Merwin | Immersive language learning system and method |
RU2688292C1 (en) * | 2018-11-08 | 2019-05-21 | Андрей Яковлевич Битюцкий | Method of memorizing foreign words |
AU2020222889A1 (en) * | 2019-02-11 | 2021-09-02 | Gemiini Educational Systems, Inc. | Verbal expression system |
US11875698B2 (en) | 2022-05-31 | 2024-01-16 | International Business Machines Corporation | Language learning through content translation |
Family Cites Families (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US591A (en) * | 1838-02-03 | Improved mill for breaking and crumbling lumps of sugar | ||
JPS5754792B2 (en) * | 1974-03-01 | 1982-11-19 | ||
US4067122A (en) * | 1975-10-17 | 1978-01-10 | Santiago Julio Fernandez | Tutor system |
US4128737A (en) | 1976-08-16 | 1978-12-05 | Federal Screw Works | Voice synthesizer |
US4333152A (en) * | 1979-02-05 | 1982-06-01 | Best Robert M | TV Movies that talk back |
US4247995A (en) * | 1979-07-09 | 1981-02-03 | Paul Heinberg | Language teaching method and apparatus |
US4406626A (en) * | 1979-07-31 | 1983-09-27 | Anderson Weston A | Electronic teaching aid |
US4406001A (en) * | 1980-08-18 | 1983-09-20 | The Variable Speech Control Company ("Vsc") | Time compression/expansion with synchronized individual pitch correction of separate components |
US4443199A (en) * | 1982-05-18 | 1984-04-17 | Margaret Sakai | Method of teaching the pronunciation and spelling and distinguishing between the written and spoken form of any language |
FR2546323B1 (en) | 1983-05-20 | 1985-08-30 | Tomatis Alfred | APPARATUS FOR TRAINING THE PRACTICE OF A MATERNAL OR FOREIGN LANGUAGE, WITH A VIEW TO ITS FULL ASSIMILATION |
US4710877A (en) | 1985-04-23 | 1987-12-01 | Ahmed Moustafa E | Device for the programmed teaching of arabic language and recitations |
US4613309A (en) * | 1985-09-19 | 1986-09-23 | Mccloskey Emily A | Figures of speech |
US4884972A (en) | 1986-11-26 | 1989-12-05 | Bright Star Technology, Inc. | Speech synchronized animation |
US5191617A (en) * | 1987-04-20 | 1993-03-02 | Intechnica International, Inc. | Voice interactive computer system |
IL84902A (en) * | 1987-12-21 | 1991-12-15 | D S P Group Israel Ltd | Digital autocorrelation system for detecting speech in noisy audio signal |
US5315689A (en) * | 1988-05-27 | 1994-05-24 | Kabushiki Kaisha Toshiba | Speech recognition system having word-based and phoneme-based recognition means |
US4891011A (en) * | 1988-07-13 | 1990-01-02 | Cook Graham D | System for assisting the learning of a subject |
GB8817705D0 (en) * | 1988-07-25 | 1988-09-01 | British Telecomm | Optical communications system |
US5065345A (en) | 1988-11-04 | 1991-11-12 | Dyned International, Inc. | Interactive audiovisual control mechanism |
US5010495A (en) * | 1989-02-02 | 1991-04-23 | American Language Academy | Interactive language learning system |
JP2797438B2 (en) | 1989-06-02 | 1998-09-17 | ソニー株式会社 | Group language learning device |
JP2990703B2 (en) * | 1989-08-22 | 1999-12-13 | ソニー株式会社 | Learning device |
US5453014A (en) * | 1989-10-20 | 1995-09-26 | Hendriks; Helga M. L. | Dynamic language training system |
US5203705A (en) * | 1989-11-29 | 1993-04-20 | Franklin Electronic Publishers, Incorporated | Word spelling and definition educational device |
CA2035348C (en) * | 1990-02-08 | 2000-05-16 | Jean-Louis Vignaud | Adjustable fastening device with spinal osteosynthesis rods |
JPH04158397A (en) * | 1990-10-22 | 1992-06-01 | A T R Jido Honyaku Denwa Kenkyusho:Kk | Voice quality converting system |
US5170362A (en) | 1991-01-15 | 1992-12-08 | Atlantic Richfield Company | Redundant system for interactively evaluating the capabilities of multiple test subjects to perform a task utilizing a computerized test system |
US5145376A (en) * | 1991-03-04 | 1992-09-08 | Krass Jennifer M | Teaching aid for foreign language |
US5197883A (en) * | 1991-11-26 | 1993-03-30 | Johnston Louise D | Sound-coded reading |
US5275569A (en) * | 1992-01-30 | 1994-01-04 | Watkins C Kay | Foreign language teaching aid and method |
US5486111A (en) * | 1992-01-30 | 1996-01-23 | Watkins; C. Kay | Foreign language teaching aid and method |
US5273433A (en) | 1992-02-10 | 1993-12-28 | Marek Kaminski | Audio-visual language teaching apparatus and method |
US5267865A (en) | 1992-02-11 | 1993-12-07 | John R. Lee | Interactive computer aided natural learning method and apparatus |
US5302132A (en) * | 1992-04-01 | 1994-04-12 | Corder Paul R | Instructional system and method for improving communication skills |
US5692906A (en) | 1992-04-01 | 1997-12-02 | Corder; Paul R. | Method of diagnosing and remediating a deficiency in communications skills |
US5293584A (en) * | 1992-05-21 | 1994-03-08 | International Business Machines Corporation | Speech recognition system for natural language translation |
US5717818A (en) * | 1992-08-18 | 1998-02-10 | Hitachi, Ltd. | Audio signal storing apparatus having a function for converting speech speed |
US5286205A (en) * | 1992-09-08 | 1994-02-15 | Inouye Ken K | Method for teaching spoken English using mouth position characters |
US5393236A (en) * | 1992-09-25 | 1995-02-28 | Northeastern University | Interactive speech pronunciation apparatus and method |
US6311157B1 (en) * | 1992-12-31 | 2001-10-30 | Apple Computer, Inc. | Assigning meanings to utterances in a speech recognition system |
US5487671A (en) * | 1993-01-21 | 1996-01-30 | Dsp Solutions (International) | Computerized system for teaching speech |
US5562453A (en) | 1993-02-02 | 1996-10-08 | Wen; Sheree H.-R. | Adaptive biofeedback speech tutor toy |
FR2702362B3 (en) * | 1993-02-24 | 1995-04-14 | Soprane Sa | Fixator for osteosynthesis of the lumbosacral spine. |
US5531745A (en) * | 1993-03-11 | 1996-07-02 | Danek Medical, Inc. | System for stabilizing the spine and reducing spondylolisthesis |
KR100309207B1 (en) * | 1993-03-12 | 2001-12-17 | 에드워드 이. 데이비스 | Speech-interactive language command method and apparatus |
US5421731A (en) * | 1993-05-26 | 1995-06-06 | Walker; Susan M. | Method for teaching reading and spelling |
US5340316A (en) * | 1993-05-28 | 1994-08-23 | Panasonic Technologies, Inc. | Synthesis-based speech training system |
US5557706A (en) * | 1993-07-06 | 1996-09-17 | Geist; Jon | Flexible pronunciation-practice interface for recorder/player |
JP3683909B2 (en) * | 1993-10-08 | 2005-08-17 | ロゴジンスキ,チェーム | Device for treating spinal conditions |
JP2594130Y2 (en) * | 1993-10-13 | 1999-04-19 | 株式会社学習研究社 | Foreign language conversation study book using images |
US5810599A (en) * | 1994-01-26 | 1998-09-22 | E-Systems, Inc. | Interactive audio-visual foreign language skills maintenance system and method |
JPH07219418A (en) * | 1994-01-31 | 1995-08-18 | Tadashi Yamaguchi | Portable voice generator and voice reproducing system utilizing it |
US5822720A (en) | 1994-02-16 | 1998-10-13 | Sentius Corporation | System and method for linking streams of multimedia data for reference material for display |
US5529496A (en) * | 1994-03-28 | 1996-06-25 | Barrett; William | Method and device for teaching reading of a foreign language based on chinese characters |
US5540589A (en) * | 1994-04-11 | 1996-07-30 | Mitsubishi Electric Information Technology Center | Audio interactive tutor |
JP3005360U (en) * | 1994-06-17 | 1994-12-20 | 信子 横山 | Language learning textbook |
US5507746A (en) * | 1994-07-27 | 1996-04-16 | Lin; Chih-I | Holding and fixing mechanism for orthopedic surgery |
US5697789A (en) * | 1994-11-22 | 1997-12-16 | Softrade International, Inc. | Method and system for aiding foreign language instruction |
KR980700637A (en) * | 1994-12-08 | 1998-03-30 | 레이어스 닐 | METHOD AND DEVICE FOR ENHANCING THE RECOGNITION OF SPEECH AMONG SPEECH-IMPAIRED INDIVIDUALS |
FR2731344B1 (en) * | 1995-03-06 | 1997-08-22 | Dimso Sa | SPINAL INSTRUMENTATION ESPECIALLY FOR A ROD |
US5717828A (en) * | 1995-03-15 | 1998-02-10 | Syracuse Language Systems | Speech recognition apparatus and method for learning |
US5562661A (en) * | 1995-03-16 | 1996-10-08 | Alphatec Manufacturing Incorporated | Top tightening bone fixation apparatus |
WO1996037159A1 (en) * | 1995-05-22 | 1996-11-28 | Synthes Ag Chur | Clamp jaw for a spinal fixation device |
US6109923A (en) * | 1995-05-24 | 2000-08-29 | Syracuse Language Systems | Method and apparatus for teaching prosodic features of speech |
US6164971A (en) * | 1995-07-28 | 2000-12-26 | Figart; Grayden T. | Historical event reenactment computer systems and methods permitting interactive role players to modify the history outcome |
CN1117368C (en) * | 1995-07-31 | 2003-08-06 | 三星电子株式会社 | Autotropic/selflight-disc recording/reproducing language-learning information method |
US5772446A (en) * | 1995-09-19 | 1998-06-30 | Rosen; Leonard J. | Interactive learning system |
JP3820421B2 (en) * | 1995-10-03 | 2006-09-13 | 圭介 高森 | Learning device |
US5799276A (en) * | 1995-11-07 | 1998-08-25 | Accent Incorporated | Knowledge-based speech recognition system and methods having frame length computed based upon estimated pitch period of vocalic intervals |
US5729694A (en) * | 1996-02-06 | 1998-03-17 | The Regents Of The University Of California | Speech coding, reconstruction and recognition using acoustics and electromagnetic waves |
US5649826A (en) * | 1996-03-19 | 1997-07-22 | Sum Total, Inc. | Method and device for teaching language |
US5727951A (en) * | 1996-05-28 | 1998-03-17 | Ho; Chi Fai | Relationship-based computer-aided-educational system |
US5735693A (en) * | 1996-03-29 | 1998-04-07 | Multi Lingua Ltd. | Method and device for learning a foreign language |
IL120622A (en) * | 1996-04-09 | 2000-02-17 | Raytheon Co | System and method for multimodal interactive speech and language training |
US5730603A (en) * | 1996-05-16 | 1998-03-24 | Interactive Drama, Inc. | Audiovisual simulation system and method with dynamic intelligent prompts |
US5845238A (en) * | 1996-06-18 | 1998-12-01 | Apple Computer, Inc. | System and method for using a correspondence table to compress a pronunciation guide |
JPH1011447A (en) * | 1996-06-21 | 1998-01-16 | Ibm Japan Ltd | Translation method and system based upon pattern |
US5766015A (en) * | 1996-07-11 | 1998-06-16 | Digispeech (Israel) Ltd. | Apparatus for interactive language training |
JP2871620B2 (en) * | 1996-09-06 | 1999-03-17 | 株式会社ロバート・リード商会 | Bone fixation device |
US5884263A (en) * | 1996-09-16 | 1999-03-16 | International Business Machines Corporation | Computer note facility for documenting speech training |
US5879350A (en) * | 1996-09-24 | 1999-03-09 | Sdgi Holdings, Inc. | Multi-axial bone screw assembly |
US5857173A (en) * | 1997-01-30 | 1999-01-05 | Motorola, Inc. | Pronunciation measurement device and method |
US6632096B1 (en) | 1997-05-26 | 2003-10-14 | Haruyuki Sumimoto | Method and apparatus for teaching and learning |
CA2286935C (en) * | 1997-05-28 | 2001-02-27 | Shinar Linguistic Technologies Inc. | Translation system |
US5920838A (en) * | 1997-06-02 | 1999-07-06 | Carnegie Mellon University | Reading and pronunciation tutor |
US6017219A (en) * | 1997-06-18 | 2000-01-25 | International Business Machines Corporation | System and method for interactive reading and language instruction |
JPH1138863A (en) * | 1997-07-17 | 1999-02-12 | Fuji Xerox Co Ltd | Language information apparatus |
US6163769A (en) | 1997-10-02 | 2000-12-19 | Microsoft Corporation | Text-to-speech using clustered context-dependent phoneme-based units |
DE69721278T2 (en) * | 1997-12-17 | 2004-02-05 | Robert Lange | Apparatus for stabilizing certain vertebrae of the spine |
US5957699A (en) * | 1997-12-22 | 1999-09-28 | Scientific Learning Corporation | Remote computer-assisted professionally supervised teaching system |
US5995932A (en) | 1997-12-31 | 1999-11-30 | Scientific Learning Corporation | Feedback modification for accent reduction |
USRE38432E1 (en) * | 1998-01-29 | 2004-02-24 | Ho Chi Fai | Computer-aided group-learning methods and systems |
US6134529A (en) | 1998-02-09 | 2000-10-17 | Syracuse Language Systems, Inc. | Speech recognition apparatus and method for learning |
US6517351B2 (en) * | 1998-02-18 | 2003-02-11 | Donald Spector | Virtual learning environment for children |
US6227863B1 (en) * | 1998-02-18 | 2001-05-08 | Donald Spector | Phonics training computer system for teaching spelling and reading |
US6077085A (en) * | 1998-05-19 | 2000-06-20 | Intellectual Reserve, Inc. | Technology assisted learning |
GB2338333B (en) * | 1998-06-09 | 2003-02-26 | Aubrey Nunes | Computer assisted learning system |
US6370498B1 (en) * | 1998-06-15 | 2002-04-09 | Maria Ruth Angelica Flores | Apparatus and methods for multi-lingual user access |
US6305942B1 (en) * | 1998-11-12 | 2001-10-23 | Metalearning Systems, Inc. | Method and apparatus for increased language fluency through interactive comprehension, recognition and generation of sounds, words and sentences |
DE19852896A1 (en) * | 1998-11-17 | 2000-05-18 | Alcatel Sa | Process for the automatic creation and monitoring of a schedule for a learning course by a computer |
US6275789B1 (en) * | 1998-12-18 | 2001-08-14 | Leo Moser | Method and apparatus for performing full bidirectional translation between a source language and a linked alternative language |
US6234802B1 (en) * | 1999-01-26 | 2001-05-22 | Microsoft Corporation | Virtual challenge system and method for teaching a language |
US6356865B1 (en) * | 1999-01-29 | 2002-03-12 | Sony Corporation | Method and apparatus for performing spoken language translation |
US6224383B1 (en) * | 1999-03-25 | 2001-05-01 | Planetlingo, Inc. | Method and system for computer assisted natural language instruction with distracters |
US6397185B1 (en) | 1999-03-29 | 2002-05-28 | Betteraccent, Llc | Language independent suprasegmental pronunciation tutoring system and methods |
US6397158B1 (en) * | 1999-04-29 | 2002-05-28 | Sun Microsystems, Inc. | Method for determining capacitance values for quieting noisy power conductors |
US6325630B1 (en) | 1999-06-09 | 2001-12-04 | Josef Grabmayr | Foreign language learning device and method |
US6438515B1 (en) * | 1999-06-28 | 2002-08-20 | Richard Henry Dana Crawford | Bitextual, bifocal language learning system |
US6468084B1 (en) | 1999-08-13 | 2002-10-22 | Beacon Literacy, Llc | System and method for literacy development |
JP2001075957A (en) * | 1999-08-24 | 2001-03-23 | Internatl Business Mach Corp <Ibm> | Display method and device for structure of natural language |
US6736642B2 (en) * | 1999-08-31 | 2004-05-18 | Indeliq, Inc. | Computer enabled training of a user to validate assumptions |
US6341958B1 (en) * | 1999-11-08 | 2002-01-29 | Arkady G. Zilberman | Method and system for acquiring a foreign language |
US6302695B1 (en) | 1999-11-09 | 2001-10-16 | Minds And Technologies, Inc. | Method and apparatus for language training |
US20010041328A1 (en) * | 2000-05-11 | 2001-11-15 | Fisher Samuel Heyward | Foreign language immersion simulation process and apparatus |
US6565358B1 (en) * | 2000-05-18 | 2003-05-20 | Michel Thomas | Language teaching system |
JP2001337595A (en) * | 2000-05-25 | 2001-12-07 | Hideo Ando | Language learning support program and constituting method for the program |
US6741833B2 (en) * | 2000-07-21 | 2004-05-25 | Englishtown, Inc. | Learning activity platform and method for teaching a foreign language over a network |
US6704699B2 (en) * | 2000-09-05 | 2004-03-09 | Einat H. Nir | Language acquisition aide |
US6782356B1 (en) * | 2000-10-03 | 2004-08-24 | Hewlett-Packard Development Company, L.P. | Hierarchical language chunking translation table |
JP2002171348A (en) * | 2000-12-01 | 2002-06-14 | Docomo Mobile Inc | System and method for providing audio information |
US6714911B2 (en) * | 2001-01-25 | 2004-03-30 | Harcourt Assessment, Inc. | Speech transcription and analysis system and method |
US7103534B2 (en) * | 2001-03-31 | 2006-09-05 | Microsoft Corporation | Machine learning contextual approach to word determination for text input via reduced keypad keys |
US20030006969A1 (en) * | 2001-05-16 | 2003-01-09 | Barras Bryan L. | System for one-touch bible book selection and viewing |
JP2003029619A (en) * | 2001-07-11 | 2003-01-31 | Lexpo:Kk | Language practice equipment |
US6729882B2 (en) * | 2001-08-09 | 2004-05-04 | Thomas F. Noble | Phonetic instructional database computer device for teaching the sound patterns of English |
US20030040899A1 (en) * | 2001-08-13 | 2003-02-27 | Ogilvie John W.L. | Tools and techniques for reader-guided incremental immersion in a foreign language text |
JP2003085093A (en) * | 2001-09-07 | 2003-03-20 | Mitsubishi Chemicals Corp | Information release system, device, method, and program and computer readable recording medium storing information release program |
JP2003150041A (en) * | 2001-11-07 | 2003-05-21 | Inventec Corp | Story interactive grammar teaching system and method |
US20030152904A1 (en) * | 2001-11-30 | 2003-08-14 | Doty Thomas R. | Network based educational system |
JP4032761B2 (en) * | 2002-02-06 | 2008-01-16 | 日本電気株式会社 | Education service providing server, educational content providing method, and program |
JP2002229442A (en) * | 2002-04-03 | 2002-08-14 | Akira Saito | System for learning foreign language based on dvd video |
2003
- 2003-09-02 US US10/652,620 patent/US7524191B2/en active Active

2004
- 2004-08-31 CN CNA2004800245520A patent/CN1842831A/en active Pending
- 2004-08-31 JP JP2006524943A patent/JP2007504496A/en active Pending
- 2004-08-31 WO PCT/US2004/028216 patent/WO2005022487A2/en active Application Filing
- 2004-08-31 EP EP04782650A patent/EP1673752A4/en not_active Withdrawn

2006
- 2006-03-02 KR KR1020067004359A patent/KR101118927B1/en active IP Right Grant

2011
- 2011-06-06 JP JP2011126162A patent/JP2011197691A/en active Pending
Non-Patent Citations (1)
Title |
---|
See references of EP1673752A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR20060123718A (en) | 2006-12-04 |
JP2011197691A (en) | 2011-10-06 |
WO2005022487A3 (en) | 2005-12-29 |
US20050048449A1 (en) | 2005-03-03 |
KR101118927B1 (en) | 2012-03-09 |
JP2007504496A (en) | 2007-03-01 |
CN1842831A (en) | 2006-10-04 |
EP1673752A2 (en) | 2006-06-28 |
US7524191B2 (en) | 2009-04-28 |
EP1673752A4 (en) | 2008-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7524191B2 (en) | System and method for language instruction | |
Flowerdew et al. | The effect of discourse markers on second language lecture comprehension | |
Flowerdew et al. | Second language listening: Theory and practice | |
Church et al. | The role of gesture in bilingual education: Does gesture enhance learning? | |
Thanajaro | Using authentic materials to develop listening comprehension in the English as a second language classroom | |
Hismanoglu | An investigation of ELT students’ intercultural communicative competence in relation to linguistic proficiency, overseas experience and formal instruction | |
Smith et al. | Communicate: Strategies for international teaching assistants | |
Oprandy | Listening/speaking in second and foreign language teaching | |
AlShammari et al. | Building an interactive e-learning tool for deaf children: interaction design process framework | |
Murphey et al. | Intensifying practice and noticing through videoing conversations for self-evaluation | |
JP6656529B2 (en) | Foreign language conversation training system | |
Diehl et al. | The communication journey of a fully included child with an autism spectrum disorder | |
Mole | Deaf and multilingual | |
Al-Otaibi | The Effectiveness of Using Multimedia in Developing the English Language Listening Skills of Intermediate Female Students | |
Tyers | Role play and interaction in second language acquisition | |
Rahmat et al. | USING TALKING BOOKS TO IMPROVE THE SEVENTH GRADE STUDENTS' SPEAKING ACHIEVEMENT OF SMP MUHAMMADIYAH 11 ROGOJAMPI | |
SAIDOUNI et al. | Investigating teachers' and students' attitudes towards the use of storytelling to improve EFL learners' speaking skill | |
Carr | Micro-Lessons Through the Lens of Sociocultural Theory and Systemic Theoretical Instruction | |
HAFFERSSAS | The Impact of Implementing Same-Language Subtitled Audiovisual Material on EFL Learners’ Vocabulary Comprehension and Spelling | |
Lee | The process of intercultural communication through one-to-one videoconferencing in English education | |
Omonxo’jayeva | TEACHING LISTENING THROUGH ACTIVITIES IN PRIMARY SCHOOL | |
Abbade | Comparing and consonating: delving into English comparative and the sound of "L": navigating between English grammar and phonetics | |
Molina | Developing Listening and Speaking Skills of Adult Learners: An Instructional Packet for ESL Teachers | |
Sew | DIGITISED IDENTITY OF MALAY LANGUAGE LEARNERS AT THE TERTIARY LEVEL: IDENTITI DIGITAL PELAJAR BAHASA MELAYU PADA PERINGKAT PENGAJIAN TINGGI | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200480024552.0 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006524943 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067004359 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 656/KOLNP/2006 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004782650 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 2004782650 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020067004359 Country of ref document: KR |