http://www.vox.com/2016/3/24/11299034/twitter-microsoft-tay-robot-hate-racist-sexist
https://www.tay.ai/
"The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."
"Tay may use the data that you provide to search on your behalf. Tay may also use information you share with her to create a simple profile to personalize your experience. Data and conversations you provide to Tay are anonymized and may be retained for up to one year to help improve the service. Learn more about Microsoft privacy here."
So information about the people Tay chats with gets stored and used for personalization — even if it's stored anonymously, it seems likely that similar people asking similar questions would get similar answers.
https://blogs.microsoft.com/blog/2016/03/25/learning-tays-introduction/
"As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups. We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience."
"It’s through increased interaction where we expected to learn more and for the AI to get better and better."
"The logical place for us to engage with a massive group of users was Twitter. Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay."
"AI systems feed off of both positive and negative interactions with people. In that sense, the challenges are just as much social as they are technical."
"To do AI right, one needs to iterate with many people and often in public forums."
It seems they sent a too-young AI out too early into a place where deranged people roam.
Because the AI interacts with people, the challenge is said to be as much social as technical.
Lee Sedol needed three games against AlphaGo to understand it and find its weakness, but Twitter's sickos apparently spotted the loophole straight from Microsoft's announcement.
Machines and humans alike can only learn through trial and error.
The importance of early experience and early learning is similar, too.