Alexa, What's Today's Lesson?
Participate and share : Interactive lecture
Monday, June 24, 4:00–5:00 pm
Location: Room 124
AI and voice-first technology are changing everything, so why not education? Business educator and entrepreneur David Gaudet will show how to transform lessons into Alexa flash briefings and/or podcasts. You'll become Amazon developers and be ready to produce content for the fall semester, all accessible via voice command.
|Audience:||Teachers, Teacher education/higher ed faculty|
|Attendee devices:||Devices required|
|Attendee device specification:||Laptop: Mac, PC|
|Participant accounts, software and other materials:||Attendees will need accounts established on two different websites in order to access their services:
- https://developer.amazon.com/alexa/console/ask? (Amazon account)
Beyond this, participants would benefit from arriving with an idea for content that could be shared with their students on a regular basis.
|Focus:||Digital age teaching & learning|
|Topic:||Innovation in higher education|
|Grade level:||Community college/university|
|Subject area:||Career and technical education, Computer science|
|ISTE Standards:||For Administrators:
Digital Age Learning Culture
Digital Age Learning Environments
The ongoing and increasing challenge of learner engagement can often be tackled through the use of ubiquitous technology and digital tools. We have seen this over the years through integrating laptops, tablets, YouTube, Adobe products, etc. into our courses and classrooms. The presenter of this session, David Gaudet, is no stranger to introducing commonplace tech into his lessons, having leveraged the popularity and strength of Facebook as early as 2007 as a means of engaging students.
Twelve years later, the only change to student engagement is its ongoing decrease, proportional to the increase in distractions. Again, the "if you can't beat them, join them" mantra can apply.
Gaudet introduced voice-first technology in the form of an Amazon Alexa term project in his "Innovation and Design" course at the Southern Alberta Institute of Technology. Students were tasked with identifying a problem that could be solved through an Alexa "Flash Briefing," then designing the solution by creating a real-life briefing, published for the world to hear.
This term-long project will be condensed into the allowable time (60-90 minutes) by first working through a process closely resembling the first four steps of Stanford's design thinking model: empathize, define, ideate and prototype (participants will have to conduct their own tests after the fact). Tools, including Amazon Alexa and a free-trial hosting platform, will be introduced to demonstrate the ease of entry into this space.
By the end of the session, each participant will have an Amazon Developer Account, a hosting account, and an idea of how to use flash briefings as a digital supplement to their classroom teaching.
I. Intro (0-10 minutes):
- Learners will be introduced to the rapidly growing AI and voice assistant digital category.
- A brief overview of Amazon Alexa and other market leaders, plus a breakdown of Alexa terminology and processes.
II. Hands-On Account Setup (10-20 minutes):
- Participants will establish and sign into Amazon and hosting accounts.
- Preliminary stages of setting up an Alexa Flash Briefing online through the Amazon Developer Console.
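Once the Developer Console steps above are underway, the briefing itself is just a feed that the hosting platform serves. As a minimal sketch of what gets produced, assuming the JSON field names from Amazon's documented Flash Briefing feed format (`uid`, `updateDate`, `titleText`, `mainText`, `redirectionUrl`; the example UID, title, text, and URL here are placeholders, not part of the session materials), one item of a text-based briefing can be built in Python like this:

```python
import json
from datetime import datetime, timezone

def make_briefing_item(uid, title, text):
    """Build one text-based Flash Briefing feed item.

    Field names follow Amazon's Flash Briefing feed format;
    Alexa reads `mainText` aloud using text-to-speech.
    """
    return {
        "uid": uid,                         # unique ID per feed item
        "updateDate": datetime.now(timezone.utc).strftime(
            "%Y-%m-%dT%H:%M:%S.0Z"),        # ISO 8601 timestamp
        "titleText": title,                 # shown in the Alexa app
        "mainText": text,                   # what Alexa speaks
        "redirectionUrl": "https://example.com/lesson-notes",  # placeholder
    }

item = make_briefing_item(
    "urn:uuid:lesson-2019-06-24",
    "Today's Lesson",
    "Welcome back. Today we review the four steps of design thinking.",
)
print(json.dumps(item, indent=2))
```

For an audio briefing, the same item would carry a `streamUrl` pointing at a hosted audio file instead of relying on `mainText` alone.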
III. Hands-On Flash Briefing Development (20-30 minutes):
- Participants work on the content idea and value proposition of their flash briefing.
- Production values, recording techniques and best practices are discussed and demonstrated.
The session facilitator will work the room, ensuring that attendees' basic problems are addressed promptly so that everyone can get on with the project at hand.
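Before pointing a real hosting platform at a feed, attendees may want to sanity-check it. A throwaway local server is enough for inspection (local testing only; a live flash briefing requires a publicly reachable HTTPS endpoint, and all names and values below are illustrative, not from the session materials):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A one-item feed in the Flash Briefing JSON shape (placeholder values).
FEED = [{
    "uid": "urn:uuid:lesson-001",
    "updateDate": "2019-06-24T16:00:00.0Z",
    "titleText": "Today's Lesson",
    "mainText": "A quick recap of prototyping your first flash briefing.",
    "redirectionUrl": "https://example.com/notes",
}]

class FeedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(FEED).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

server = HTTPServer(("127.0.0.1", 0), FeedHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the feed back, exactly as a skill-side validator would.
url = f"http://127.0.0.1:{server.server_port}/feed.json"
with urllib.request.urlopen(url) as resp:
    items = json.loads(resp.read())
print(items[0]["titleText"])  # → Today's Lesson
server.shutdown()
```

This keeps the workshop's focus on content: if the feed parses cleanly here, the remaining work is uploading it to the hosting platform chosen in part II.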
AI/Voice assistant technology first started penetrating our everyday lives with Siri in 2011, but it wasn't until Amazon launched Alexa and Echo in 2014, and Google introduced Assistant in 2016, that the chasm of consumer adoption began to close.
According to Forbes, the number of smart speaker users was forecast to grow 48% from 2017 to 2018 (https://www.forbes.com/sites/johnkoetsier/2018/05/29/smart-speaker-users-growing-48-annually-will-outnumber-wearable-tech-users-this-year/#19cbb3305dde); in fact, smart speaker sales increased a staggering 78% year over year (https://www.recode.net/2019/1/8/18173696/amazon-alexa-google-assistant-smart-speaker-sales-npr).
Smart speakers are the pipeline infrastructure to a coming tidal wave of content. Like anything else digital, the content creators can be anybody - including faculty providing content for students.
There is no shortage of commercial publications and academic journal articles on the growing pervasiveness of this field. In Smartsheet's article "How Voice Assistants Are Changing the World," Tractica, a market intelligence firm that focuses on human interaction with technology, makes the following forecasts:
"Unique consumer users for virtual digital assistants (which they define as automated software applications or platforms that assist the human user through understanding natural language in written or spoken form) will grow from more than 390 million worldwide users in 2015 to 1.8 billion by the end of 2021. The growth in the business world is expected to increase from 155 million users in 2015 to 843 million by 2021. With that kind of projected growth, revenue is forecasted to grow from $1.6 billion in 2015 to $15.8 billion in 2021. (https://www.smartsheet.com/voice-assistants-artificial-intelligence).
With such growth in how society interacts with this technology, and the underlying behavior that drives demand (i.e., convenience), it only makes sense for it to take its place inside education as a teaching and learning tool. This session is designed to encourage faculty to do exactly that.