{"id":1428,"date":"2021-06-22T12:24:51","date_gmt":"2021-06-22T03:24:51","guid":{"rendered":"https:\/\/blog.testworks.co.kr\/en\/?p=1428"},"modified":"2021-06-23T16:52:23","modified_gmt":"2021-06-23T07:52:23","slug":"ai-data_community_meetup_sign_languge","status":"publish","type":"post","link":"https:\/\/blog.aiworkx.ai\/en\/ai-data_community_meetup_sign_languge\/","title":{"rendered":"A Special Meetup for Special People"},"content":{"rendered":"\n<p><\/p>\n\n\n\n<p><\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized is-style-rounded\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blog.testworks.co.kr\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/\uc774\ucc3d\uc2e0-\uc18c\uc7a5.jpg\" alt=\"\" class=\"wp-image-1429\" width=\"163\" height=\"150\" srcset=\"https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/\uc774\ucc3d\uc2e0-\uc18c\uc7a5.jpg 652w, https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/\uc774\ucc3d\uc2e0-\uc18c\uc7a5-300x276.jpg 300w\" sizes=\"auto, (max-width: 163px) 100vw, 163px\" \/><\/figure>\n\n\n\n<p class=\"has-normal-font-size\"><strong>Changsin Lee l Tech Evangelist l Testworks<\/strong><\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-1 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">A special Meetup was organized by <a href=\"https:\/\/testworks.co.kr\/\">Testworks<\/a> for developers and innovators who are interested in creating social value through technology.&nbsp; The first Meetup event took place on June 17<sup>th<\/sup>, 2021. Unlike most meetups, which have moved entirely online, this one had a physical venue and was held in a <a href=\"https:\/\/testworks.co.kr\/\">Testworks<\/a> conference room. 
Three people from three different companies presented their research on using AI to help people with hearing impairments communicate.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-2 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"624\" height=\"351\" src=\"https:\/\/blog.testworks.co.kr\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/MeetUp_1st_event.jpg\" alt=\"\" class=\"wp-image-1430\" srcset=\"https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/MeetUp_1st_event.jpg 624w, https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/MeetUp_1st_event-300x169.jpg 300w\" sizes=\"auto, (max-width: 624px) 100vw, 624px\" \/><\/figure><\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-3 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p style=\"font-size:17px\">Here are a few things I learned about sign language from the talks:<\/p>\n\n\n\n<p style=\"font-size:17px\">1. <strong>Sign language evolved naturally<\/strong>: Contrary to my expectation, sign language was not invented by linguists; it evolved naturally, just like any other natural language.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">2. <strong>Not directly mappable to a natural language:<\/strong>&nbsp;Korean Sign Language is entirely different from spoken Korean. Similarly, American Sign Language cannot simply be replaced word for word with English. You must learn each sign language just as you would a foreign language.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">3. <strong>More than hands<\/strong>: Hands are not the only medium of expression. 
Equally important are NMS (non-manual signs) such as facial expressions and body postures.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">4. <strong>Lots of dialects and slang<\/strong>: Surprisingly, sign languages have many dialects, slang words, and personal idiosyncrasies, just like natural languages.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">5. <strong>No standard written language<\/strong>: Unlike most natural languages, sign languages cannot be written down easily. In other words, there is no standard written form of sign language. This makes learning sign language difficult and training an AI system even more challenging, because you must capture the movements of signing on video.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">6. <strong>High illiteracy:<\/strong>&nbsp;You might think that people with hearing impairments can read lips or written text, so a sign language interpreter might not be necessary. Unfortunately, because educational opportunities have been limited for them, many cannot read or write their native natural language. There are also many people with hearing impairments who do not understand sign language either.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-4 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">There are 370,000 people with hearing impairments in Korea, and only twelve percent understand written Korean perfectly. With such a high illiteracy rate, AI-assisted sign language recognition has huge social value. The goal is lofty and there are many technical challenges, but the three talks showed promising results from ongoing research. 
Here is the summary:<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-5 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-6 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:18px\"><strong>Collecting Sign Language Key Points Using Multi-Cameras and 3D Data Augmentation<\/strong><\/p>\n\n\n\n<p style=\"font-size:17px\">Seokmin Yun l Data Management Team Manager l Testworks<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"624\" height=\"343\" src=\"https:\/\/blog.testworks.co.kr\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/MeetUp_1st_event_\uc724\uc11d\ubbfc.jpg\" alt=\"\" class=\"wp-image-1431\" srcset=\"https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/MeetUp_1st_event_\uc724\uc11d\ubbfc.jpg 624w, https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/MeetUp_1st_event_\uc724\uc11d\ubbfc-300x165.jpg 300w\" sizes=\"auto, (max-width: 624px) 100vw, 624px\" \/><\/figure><\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-8 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">Collecting sign language data requires a special setup. 
Ordinary cameras cannot capture fast hand movements accurately due to their slow shutter speeds, and occlusions happen quite frequently. To collect good training data, five high-speed cameras were used to capture the initial key points. 3D reconstructions were then built, making it possible to project key points from any angle or position. The setup required careful calibration at the start, and auto-annotation with the <a href=\"https:\/\/github.com\/CMU-Perceptual-Computing-Lab\/openpose\">OpenPose<\/a> library was pivotal to completing the project.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-10 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:18px\"><strong>Sign Language Recognition Using Multiple AI Systems<\/strong><\/p>\n\n\n\n<p style=\"font-size:17px\">Han-Mu Park l Senior Researcher l <a href=\"https:\/\/www.keti.re.kr\/main\/main.php\">KETI<\/a><\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-11 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"624\" height=\"342\" src=\"https:\/\/blog.testworks.co.kr\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/KETI_\ubc15\ud55c\ubb34-\uc5f0\uad6c\uc6d0-\ubc1c\ud45c_3.png\" alt=\"\" class=\"wp-image-1432\" 
srcset=\"https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/KETI_\ubc15\ud55c\ubb34-\uc5f0\uad6c\uc6d0-\ubc1c\ud45c_3.png 624w, https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/KETI_\ubc15\ud55c\ubb34-\uc5f0\uad6c\uc6d0-\ubc1c\ud45c_3-300x164.png 300w\" sizes=\"auto, (max-width: 624px) 100vw, 624px\" \/><\/figure><\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-12 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\"><a href=\"https:\/\/www.keti.re.kr\/main\/main.php\">KETI<\/a> developed its own Korean Sign Language recognition engine. The collection process differs from <a href=\"https:\/\/testworks.co.kr\/\">Testworks<\/a>\u2019 in that it used three ZED cameras, but it had to overcome similar technical challenges, such as pose estimation and occlusion. The pilot service, however, showed promising results and was deployed at Gimpo International Airport as a dedicated kiosk for sign language interpretation. Their current approach translates each sentence as a whole, which is not scalable. 
The next version will break translation down to the morpheme level and enable dynamic composition of sentences.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-13 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-14 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:18px\"><strong>Avatar Sign Language Generation Using AI<\/strong><\/p>\n\n\n\n<p style=\"font-size:17px\">Mathew Huerta-Enochian l AI Developer l <a href=\"https:\/\/www.eq4all.co.kr\/?lang=en\">EQ4ALL<\/a><\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-15 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"624\" height=\"351\" src=\"https:\/\/blog.testworks.co.kr\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/EQ4ALL_Mathew_\ubc1c\ud45c.png\" alt=\"\" class=\"wp-image-1433\" srcset=\"https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/EQ4ALL_Mathew_\ubc1c\ud45c.png 624w, https:\/\/blog.aiworkx.ai\/en\/wp-content\/uploads\/sites\/3\/2021\/06\/EQ4ALL_Mathew_\ubc1c\ud45c-300x169.png 300w\" sizes=\"auto, (max-width: 624px) 100vw, 624px\" \/><\/figure><\/div>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-16 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">While Testworks and KETI focused on 
the collection and recognition aspects of sign language, <a href=\"https:\/\/www.eq4all.co.kr\/?lang=en\">EQ4ALL<\/a> is working on generating sign language through avatars. Leveraging the latest advancements in deep learning, especially Transformer-based NLP models, <a href=\"https:\/\/www.eq4all.co.kr\/?lang=en\">EQ4ALL<\/a> turned the generation of sign language from text into a neural machine translation problem. Using an attention-based encoder-decoder model, the system can generate sign language in real time through an avatar. Traditional symbolic methods (template-based and rule-based) are still used as a fallback mechanism, but the main workhorse is the attention-based neural machine translation model.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-17 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-18 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">Like many other AI projects, the lack of sign language data is the biggest challenge that KETI and EQ4ALL face right now, and they are happy to see the quality data that <a href=\"https:\/\/testworks.co.kr\/\">Testworks<\/a> was able to deliver.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-19 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n\n\n\n<p style=\"font-size:17px\">More Meetups are scheduled for the future. To take myself as an example, engineers often have a nagging feeling in the back of their minds, wondering whether their work has any impact on other people. 
The forum might be a great vehicle for like-minded engineers. It was a small beginning, but it takes just a small spark to start a fire that burns down a whole forest. I certainly saw a few scintillating sparks tonight.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-20 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-21 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-22 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\"><\/div>\n<\/div>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>First Offline Meetup for Social 
Impact<\/p>\n","protected":false},"author":1,"featured_media":1434,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[372,373,374,375,377,378,379,113,382,376,138,383,380,381],"class_list":["post-1428","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news","tag-ai-data-community-meetup","tag-ai-developer-community","tag-ai-sign-language-data","tag-avatar","tag-deep-learning","tag-eq4all","tag-keti","tag-nlp","tag-quality-dataset","tag-sign-language-recognition-engin","tag-testworks","tag-testworks-blog","tag-transformer-based-nlp-models","tag-vision-keypoint"],"_links":{"self":[{"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/posts\/1428","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/comments?post=1428"}],"version-history":[{"count":5,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/posts\/1428\/revisions"}],"predecessor-version":[{"id":1440,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/posts\/1428\/revisions\/1440"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/media\/1434"}],"wp:attachment":[{"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/media?parent=1428"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/categories?post=1428"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.aiworkx.ai\/en\/wp-json\/wp\/v2\/tags?post=1428"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}