Category archive: Education

Amazon’s Alexa Fund invests in edtech startups, Zoobean and Unruly Studios


Two edtech companies, Zoobean and Unruly Studios, are the latest to receive investment from the Alexa Fund, Amazon’s corporate venture fund focused on fueling innovation in voice technologies and, specifically, integrations with its Alexa voice assistant. Zoobean, which you may recall from its Shark Tank appearance a few years ago, offers reading software to schools and libraries. And Unruly Studios, a 2018 Alexa Accelerator alum, connects STEM learning for kids with physical play.

Zoobean makes a software platform called Beanstack, which helps educators and librarians run reading challenges and learn from data-driven insights about participants’ progress.

Following its Shark Tank debut, where it received investment from Mark Cuban, the company has continued to develop its software and expand its footprint. Today, Beanstack is licensed to over 1,200 public libraries and schools worldwide and is often called the “Fitbit for reading” for how it tracks and rewards progress.

Along with the Alexa Fund, Cuban has re-invested in the platform, joined by EAI Technologies, Jo and Elizabeth Tango, Pamela Bass-Bookey and Harry Bookey, and Neil Jaffe of Booksource. The company didn’t share the round’s size, and Amazon declines to disclose the size of its Alexa Fund investments.

Zoobean is now exploring ways to integrate Alexa into Beanstack so readers can ask Alexa to track their progress or to send them reminders about reading time.
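No such skill exists yet, but the general shape of this kind of integration is well established. Below is a minimal sketch of a reading-log intent written as a bare AWS Lambda handler; the intent name, the slot, and the commented-out Beanstack call are all hypothetical placeholders, not Zoobean’s actual design.

```python
# Hypothetical sketch of a "log my reading" Alexa intent for Beanstack.
# The intent name, slot, and beanstack_api call are invented for
# illustration; Zoobean is still exploring what the real skill will do.

def lambda_handler(event, context):
    request = event["request"]
    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "LogReadingIntent"):
        minutes = int(request["intent"]["slots"]["Minutes"]["value"])
        # A real skill would record the session against the reader's
        # challenge progress here, e.g.:
        # beanstack_api.log_session(event["session"]["user"]["userId"], minutes)
        speech = f"Nice work! I logged {minutes} minutes toward your challenge."
    else:
        speech = "You can say, for example: log twenty minutes of reading."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```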

Unruly Studios, meanwhile, approaches learning through play. Led by Bryanne Leeming, the team includes gaming veterans from Mattel, Nickelodeon, iRobot, Hasbro, and elsewhere, and has designed a product called Unruly Splats: programmable floor tiles that pair with an app to let kids play recess-style games such as musical chairs, whack-a-mole, and relay races.

The company participated in the Alexa Accelerator last year — in fact, it was Unruly that connected Zoobean founder Felix Lloyd with the Alexa Fund. Unruly Studios is now exploring ways to connect its “Splats” with Alexa to make the experience more engaging while also teaching kids programming fundamentals and voice design.

The Alexa Fund investment was part of Unruly’s $1.8 million seed round, closed in April and also announced today. The round was led by eCoast Angels and saw participation from new and existing investors including AT&T, Rough Draft Ventures powered by General Catalyst, Techstars, LearnLaunch, NextFab, and other New England angel funds.

“We were so impressed by Unruly Studios during their time with the Alexa Accelerator last summer, and we’re thrilled to be reinvesting here as part of their seed round,” said Paul Bernard, director of the Alexa Fund, about Amazon’s investment. “Their ability to combine STEM education with physical games is inspiring on its own, and we see voice as a way to make that experience even more fun and engaging. We can’t wait to see what they build in the future.”

These aren’t the Alexa Fund’s first investments in edtech. Last fall, the fund invested in voice-based software and services company Bamboo Learning, for example. It also invested in Sphero, which more recently pivoted to education with Sphero Edu.

“One of the reasons I’m so optimistic about voice technology is because it creates this communal experience where multiple people can share in the interaction,” said Mark Cuban, in a statement about the Alexa Fund investments. “Every startup founder should be looking at how voice services like Alexa fit into their business model, and it’s great to see companies like Zoobean and Unruly take that to heart. I’m excited to see them evolve their products and use voice to make reading and STEM accessible to more people.”

ObjectiveEd is building a better digital curriculum for vision-impaired kids


Children with vision impairments struggle to get a solid K-12 education for a lot of reasons — so the more tools their teachers have to impart basic skills and concepts, the better. ObjectiveEd is a startup that aims to empower teachers and kids with a suite of learning games accessible to all vision levels, along with tools to track and promote progress.

Some of the reasons vision-impaired kids don’t get the education they deserve are obvious: reading and writing are slower and more difficult for them than for sighted kids. Others are less so: teachers have limited time and resources to dedicate to special-needs students when their overcrowded classrooms already demand more than they can provide.

Technology isn’t the solution, but it has to be part of the solution, because technology is so empowering and kids take to it naturally. There’s no reason a blind 8-year-old can’t also be a digital native like her peers, and that presents an opportunity for teachers and parents both.

This opportunity is being pursued by Marty Schultz, who has spent the last few years heading a company that makes games for visually impaired players, and who in the process saw the potential to adapt that work for more directly educational purposes.

“Children don’t like studying and don’t like doing their homework,” he told me. “They just want to play video games.”

It’s hard to argue with that. True of many adults too, for that matter. But as Schultz points out, this is something educators have realized in recent years and turned to everyone’s benefit.

“Almost all regular education teachers use educational digital games in their classrooms and about 20 percent use it every day,” he explained. “Most teachers report an increase in student engagement when using educational video games. Gamification works because students own their learning. They have the freedom to fail, and try again, until they succeed. By doing this, students discover intrinsic motivation and learn without realizing it.”

Having learned to type, point and click, do geometry and identify countries via games, I’m a product of this same process and many of you likely are as well. It’s a great way for kids to teach themselves. But how many of those games would be playable by a kid with vision impairment or blindness? Practically none.

Held back

It turns out that these kids, like others with disabilities, are frequently left behind as the rising technology tide lifts everyone else’s boats. Creating accessible games that target skills like Braille literacy and blind navigation of rooms and streets is difficult and time-consuming, so developers haven’t been able to do it profitably, and teachers are left to jury-rig existing resources or, more likely, fall back on tried-and-true methods like printed worksheets, in-person instruction, and spoken testing.

And since teacher time is limited and instructors trained in vision-impaired learning are thin on the ground, it’s also hard to tailor these outdated methods to an individual student’s needs. For example, a kid may be great at math but lack directionality skills. You need to draw up an “individual education plan” (IEP) explaining (among other things) this and what steps need to be taken to improve, then track those improvements. It’s time-consuming and hard! The idea behind ObjectiveEd is to create both games that teach these basic skills and a platform that tracks and documents progress while adjusting lessons to the individual.

How this might work can be seen in a game like Barnyard, which, like all of ObjectiveEd’s games, is designed to be playable by blind, low-vision, or fully sighted kids. The student finds an animal in a big pen, then drags it in a specified direction. The easiest levels might use left and right, then move on to cardinal directions, clock directions, or even degrees.

“If the IEP objective is ‘Child will understand left versus right and succeed at performing this task 90 percent of the time,’ the teacher will first introduce these concepts and work with the child during their weekly session,” Schultz said. That’s the kind of hands-on instruction they already get. “The child plays Barnyard in school and at home, swiping left and right, winning points and getting encouragement, all week long. The dashboard shows how much time each child is playing, how often, and their level of success.”

That’s great for documentation for the mandated IEP paperwork, and difficulty can be changed on the fly as well:

“The teacher can set the game to get harder or faster automatically, or move onto the next level of complexity automatically (such as never repeating the prompt when the child hesitates). Or the teacher can maintain the child at the current level and advance the child when she thinks it’s appropriate.”
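Mechanically, that kind of mastery tracking and auto-advancement is simple to sketch. The snippet below is illustrative only (ObjectiveEd hasn’t published its code): it logs each Barnyard attempt, checks a rolling success rate against the 90 percent objective from the IEP, and moves to the next level only when auto-advance is enabled.

```python
# Illustrative sketch of IEP-style mastery tracking with optional
# auto-advance. The levels, window size, and threshold are assumptions.

from collections import deque

LEVELS = ["left/right", "cardinal directions", "clock directions", "degrees"]

class SkillTracker:
    def __init__(self, target_rate=0.90, window=20, auto_advance=True):
        self.target_rate = target_rate        # IEP objective, e.g. 90%
        self.attempts = deque(maxlen=window)  # rolling window of results
        self.level = 0
        self.auto_advance = auto_advance      # teacher-controlled setting

    def success_rate(self) -> float:
        return sum(self.attempts) / len(self.attempts) if self.attempts else 0.0

    def record(self, success: bool) -> None:
        self.attempts.append(success)
        if (self.auto_advance
                and len(self.attempts) == self.attempts.maxlen
                and self.success_rate() >= self.target_rate
                and self.level < len(LEVELS) - 1):
            self.level += 1                   # advance to the next concept
            self.attempts.clear()             # start fresh at the new level

tracker = SkillTracker()
for outcome in [True] * 18 + [False, True]:   # a week of play, say
    tracker.record(outcome)
print("current level:", LEVELS[tracker.level])  # -> cardinal directions
```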

This isn’t meant to be a full-on K-12 education in a tablet app. But it helps close the gap between kids who can play Mavis Beacon or whatever on school computers and vision-impaired kids who can’t.

Practical measures

Importantly, the platform is not being developed without expert help — or, as is actually very important, without a business plan.

“We’ve developed relationships with several schools for the blind as well as leaders in the community to build educational games that tackle important skills,” Schultz said. “We work with both university researchers and experienced Teachers of Visually Impaired students, and Certified Orientation and Mobility specialists. We were surprised at how many different skills and curriculum subjects that teachers really need.”

Based on their suggestions, for instance, the company has built two games to teach iPhone gestures and VoiceOver’s accessibility rotor. These may be proprietary Apple technologies, but they’re things these kids need to know how to use, just like running a Google search or operating a mouse without being able to see the screen. Why not learn them in a game like the other stuff?

Making technological advances is all well and good, but doing so while building a sustainable business is something many education startups have failed to manage. Fortunately, public school systems have significant money set aside specifically for students with special needs, and products that improve education outcomes are actively sought and paid for. These state and federal funds can’t be siphoned off for the rest of the class, so if there’s nothing to spend them on, they go unused.

ObjectiveEd has the benefit of being easily deployed without much specialty hardware or software. It runs on iPads, which are fairly common in schools and homes, and the dashboard is a simple web app. Although the platform may eventually interface with specialty hardware like Braille readers, many of the games and lessons don’t require it, which lowers the deployment bar as well.

The plan for now is to finalize and test the interface and build out the games library — ObjectiveEd isn’t quite ready to launch, and it’s important to build it with constant feedback from students, teachers, and experts. With luck, in a year or two the visually impaired youngsters at a school near you might have a fun new platform to learn and play with.

“ObjectiveEd exists to help teachers, parents and schools adapt to this new era of gamified learning for students with disabilities, starting with blind and visually impaired students,” Schultz said. “We firmly believe that well-designed software combined with ‘off-the-shelf’ technology makes all this possible. The low cost of technology has truly revolutionized the possibilities for improving education.”

Global Learning Xprize splits $10M purse for best teaching app for disadvantaged kids


Kids need a good education to have the best chance of succeeding in the world, but in remote parts of developing countries there may be neither schools nor teachers. The Global Learning Xprize aimed to spur the tech space to create app-based teaching tools those kids can use on their own — and a tie means the $10 million grand prize gets split in two.

The winners, Onebillion and Kitkit School, both created tablet apps that produced serious gains in literacy in the areas where they were deployed. Each receives $5 million, on top of the $1 million each already got as a finalist.

Funded by a number of sponsors including Elon Musk, the prize started way back in 2014. Overseen at first by Matt Keller (previously at the famous but sadly unsuccessful One Laptop Per Child program), and later by Emily Musil Church, the prize asked entrants to create free, open-source software that kids could use to teach themselves basic reading, writing, and arithmetic.

After soliciting teams and winnowing the field, the organizers arrived at five finalists: CCI, Chimple, Kitkit School, Onebillion, and Robotutors. They came from a variety of locations and backgrounds, and, as mentioned, all received a $1 million prize for reaching this stage.

These finalists were then subjected to field testing in Tanzania, where 8,000 Pixel C tablets, generously donated by Google for the purpose, were distributed to communities where teaching was hardest to come by and literacy rates lowest.

Among the participating kids, only about a quarter attended school, and only one in ten could read a single word in Swahili. By the end of the 15-month field test, 30 percent of the kids could read a complete sentence — results were even better among girls.

I asked about the field test process itself. Church, who led the prize project, gave a detailed answer that shows how closely the organization worked with local communities:

The field test was a very unique and complex operation – the field test included nearly 2,700 children and 170 villages in some of the most remote parts of Tanzania over the course of 15 months. XPRIZE worked closely with its partners on the ground to implement this unique 15-month field test – UNESCO, World Food Programme, and the Government of Tanzania. In total that required over 300 staff members in Tanzania from all levels – from the regional educational officials to village mamas — women from each village who have been empowered to ensure the smooth functioning of the test. This was truly a ground-up, community-driven operation. Logistically, this required identifying and sensitizing communities, conducting baseline and endline assessment of all the children prior to tablet distribution, installing solar charging stations in all of these villages for the tablets, and physical data collection and tablet distribution by our heroic Field Assistants on motorbikes (just to name a few of the critical activities).

Once the tablets were in the hands of the children – the general approach was to be very “hands-off” as we wanted to see whether or not the software itself was leading to learning gains. We instead relied on village mamas to create a safe environment in which a child can use the tablet when they chose to. In short – we realize that in order for this work to scale globally – hands-on instruction is hard to do.

The winning teams had similar approaches: gamify the content and make it approachable for any age or ability level. Rural Tanzania isn’t hurting literacy-wise because of a lack of worksheets. If these kids are going to learn, it needs to be engaging — like anywhere else, they learn best when they don’t realize they’re being taught.

Onebillion’s approach was to create a single but flexible long course that takes kids from absolutely zero reading knowledge to basic competency. “Onecourse is made of thousands of learning units, some could be on reading activities, some could be on numeracy activities — it’s a modular course, it’s built around the child’s day and adapts to their needs,” explained the company’s CTO, Jamie Stuart, in a video about the team.

“When the child is not yet at a stage when they can read, the story can be played back to the child a bit like an audio book. When the child starts to be able to decode words we can offer them assistance, and then later on they can attempt to read the story by themselves.”

Kitkit School came from Sooinn Lee and her husband, both game developers (with plenty of others involved, of course). She points out that games are fundamentally built around keeping the player engaged. “Sometimes in education software, I see there is software too much focused on what to deliver and what is the curriculum, rather than how a child will feel during this learning experience,” she said in her team video.

“We create gamified learning with a mixture of high quality graphics, sound, interactions, so a child will feel they’re doing a really fun activity, and they don’t care if they’re learning or not, because it feels so good.”

All the finalists were on the ground in these communities working with the kids, so this wasn’t just a fire-and-forget situation. And if we’re honest, that may partially account for the gains these kids showed.

After all, the main issue is a lack of resources, and while the tablets and curricula are a good way to bring learning to the kids, what matters most is that someone is bringing it at all. That said, pre-built fun learning experiences like this that can run on rugged, easily distributed hardware are definitely powerful tools to start with.

As for the communities involved — they won’t be left high and dry now that the testing is over. Church told me that there are plans to make the apps part of Tanzania’s education system:

Our UN partners on the ground (UNESCO and WFP) have worked hand-in-hand with the Government of Tanzania to develop a plan regarding how to continue to use the software (deployed in Tanzania as part of this project), the tablets in the project, and the solar stations installed. This plan will be implemented by the Government of Tanzania in late June in conjunction with UNESCO and WFP. Part of this plan is to get the content in all five of the applications approved to be part of the formal education system in Tanzania, so it can be integrated. We laud the foresight of Tanzania to see the value in tablet-driven learning as a way to reach all children.

And the devices themselves will stay put, or even be replaced. “The staff on the ground will work with the communities to ensure each child as part of this project receives up-to-date software and a new tablet,” Church wrote. “In addition our partners are actively working with communities to teach them how to maintain and continue to use the solar stations in their villages beyond this project.”

Google’s latest app, Rivet, uses speech processing to help kids learn to read


Rivet, a new app from Google’s in-house incubator, wants to help children struggling to read. The app hails from Area 120 — Google’s workshop for experimental projects — and includes over 2,000 free books for kids as well as an in-app assistant that can help kids when they get stuck on a word by way of advanced speech technology.

For example, if the child is having difficulty with a word, they can tap it to hear it pronounced, or say it aloud themselves and have the app show which parts they said correctly and which need work.
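Google hasn’t detailed how Rivet scores an attempt, but the marking step can be approximated by aligning what the recognizer heard against the target word. The toy sketch below aligns letters with Python’s difflib as a stand-in for the phoneme-level alignment a real system would use.

```python
# Toy approximation of per-part pronunciation feedback. A real system
# would align phonemes from the on-device recognizer; aligning letters
# with difflib merely illustrates the marking step.

from difflib import SequenceMatcher

def mark_attempt(target: str, heard: str):
    """Return (fragment, ok) pairs covering `target`, flagging mismatches."""
    matcher = SequenceMatcher(None, target, heard)
    marked = []
    for op, t0, t1, _, _ in matcher.get_opcodes():
        if t1 > t0:                    # ignore pure insertions in `heard`
            marked.append((target[t0:t1], op == "equal"))
    return marked

for fragment, ok in mark_attempt("elephant", "elefant"):
    print(f"{'ok  ' if ok else 'work'} {fragment}")
# ok   ele / work ph / ok   ant
```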

The app also includes definitions and translations for over 25 languages, to help kids — and especially non-native speakers — learn to read.

For younger readers, there’s a follow-along mode where the app will read the stories aloud with the words highlighted so the child can match up the words and sounds. When kids grow beyond needing this feature, parents can opt to disable follow-along mode so the kids have to read for themselves.

While there are a number of e-book reading apps aimed at kids on the market today, Rivet is interesting for its ability to leverage advances in voice technology and speech processing.

Starting today on Android and (soon) iOS, Rivet offers real-time help to kids when they tap the microphone button and read the page aloud. If the child hits a word and starts to struggle, the assistant proactively jumps in and offers support — much the way a parent corrects a child who reaches a word they don’t know or can’t say.
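At its simplest, that jump-in behavior reduces to a timer on recognized words. Here is a rough sketch with an invented threshold; the real app presumably runs against a streaming on-device recognizer rather than a list of timestamped words.

```python
# Rough sketch of hesitation-triggered help. The threshold and speak()
# default are assumptions, not Rivet's actual behavior.

HESITATION_SECONDS = 3.0   # assumed pause that counts as "struggling"

def follow_reading(page_words, recognized_stream, speak=print):
    """Track the reader through `page_words`, offering help after a stall.

    `recognized_stream` yields (word, timestamp) pairs from a recognizer.
    """
    expected = iter(page_words)
    target = next(expected, None)
    last_heard = None
    for word, stamp in recognized_stream:
        if target is None:
            break                                # page finished
        if last_heard is not None and stamp - last_heard > HESITATION_SECONDS:
            speak(f'The word is "{target}".')    # proactive help
        if word.lower() == target.lower():
            target = next(expected, None)        # advance on a correct read
        last_heard = stamp

follow_reading(["the", "big", "elephant"],
               [("the", 0.5), ("big", 1.2), ("elephant", 6.0)])
# prints: The word is "elephant".  (a 4.8 s stall before the word)
```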

Rivet says all the speech processing takes place on the device to protect children’s privacy, and that its app is COPPA-compliant.

When the child completes a page, they can see which words they read correctly, and which they still need to work on. The app also doles out awards by way of points and badges, and personalizes the experience using avatars, themes and books customized to the child’s interests and reading level.

Other surprises and games keep kids engaged with the app and continuing to read.

According to Rivet’s Head of Tech and Product Ben Turtel, the team wanted to work on reading because it’s a fundamental skill — and one that needs to be mastered to learn just about everything else.

“Struggling readers,” he says, “are unlikely to catch up and four times less likely to graduate from high school. Unfortunately, 64 percent of fourth-grade students in the United States perform below the proficient level in reading.”

Rivet is not the first app from Google aimed at tackling reading. An app called Bolo offers a similar feature set, but is aimed at kids in India.

While Bolo was not an Area 120 project, others from the incubator have focused on education, like the learn-to-code app Grasshopper, or on speech processing, like the customer service phone system CallJoy.

Rivet was previously spotted in the wild during beta trials this year, but is now publicly available as a free download on both Google Play and the Apple App Store across 11 countries, including the U.S.

Non-invasive glucose monitor EasyGlucose takes home Microsoft’s Imagine Cup and $100K


Microsoft’s yearly Imagine Cup student startup competition crowned its latest winner today: EasyGlucose, a non-invasive, smartphone-based method for diabetics to test their blood glucose. It and the two other finalists, all similarly aimed at social benefit, presented today at Microsoft’s Build developer conference.

The Imagine Cup brings together winners of many local student competitions around the world with a focus on social good and, of course, Microsoft services like Azure. Last year’s winner was a smart prosthetic forearm that uses a camera in the palm to identify the object it is meant to grasp. (They were on hand today as well, with an improved prototype.)

The three finalists hailed from the U.K., India, and the U.S.; EasyGlucose was a one-person team from my alma mater UCLA.

EasyGlucose takes advantage of machine learning’s knack for spotting the signal in noisy data, in this case the tiny details of the eye’s iris. It turns out, as creator Brian Chiang explained in his presentation, that the iris’s “ridges, crypts, and furrows” hide tiny hints as to their owner’s blood glucose levels.

EasyGlucose presents at the Imagine Cup finals.

These features aren’t the kind of thing you can see with the naked eye (or rather, on the naked eye), but by clipping a macro lens onto a smartphone camera Chiang was able to get a clear enough image that his computer vision algorithms were able to analyze them.

The resulting blood glucose measurement is significantly better than any other non-invasive measure and more than good enough to serve in place of the most common method used by diabetics: stabbing themselves with a needle every couple of hours. Currently EasyGlucose gets within 7 percent of the pinprick method, comfortably inside the margin required for “clinical accuracy,” and Chiang is working on closing that gap. No doubt the community will welcome the innovation, as well as the low cost: $10 for the lens adapter and $20 per month for continued support via the app.
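Chiang hasn’t published his model, but the overall shape of the approach (regressing a continuous glucose value from a close-up iris photo) is straightforward to sketch. Everything below, from layer sizes to the choice of metric, is an assumption for illustration, not EasyGlucose’s actual architecture.

```python
# Illustrative-only sketch of iris-to-glucose regression in Keras.
# EasyGlucose's real architecture, preprocessing, and training data
# are not public.

import tensorflow as tf

def build_model(input_shape=(224, 224, 3)):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),    # macro iris photo
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(128, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),                    # glucose in mg/dL
    ])

model = build_model()
# Mean absolute percentage error matches the "within 7 percent of the
# pinprick method" framing used to report accuracy.
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.MeanAbsolutePercentageError()])
```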

It’s not a home run, or not just yet: Naturally, a technology like this can’t go straight from the lab (or in this case the dorm) to global deployment. It needs FDA approval first, though it likely won’t have as protracted a review period as, say, a new cancer treatment or surgical device. In the meantime, EasyGlucose has a patent pending, so no one can eat its lunch while it navigates the red tape.

As the winner, Chiang gets $100,000, plus $50,000 in Azure credit, plus the coveted one-on-one mentoring session with Microsoft CEO Satya Nadella.

The other two Imagine Cup finalists also used computer vision (among other things) in service of social good.

Caeli is taking on the issue of air pollution by producing custom high-performance air filter masks intended for people with chronic respiratory conditions who have to live in polluted areas. This is a serious problem in many places that cheap or off-the-shelf filters can’t really solve.

It uses your phone’s front-facing camera to scan your face and pick the mask shape that makes the best seal. What’s the point of a high-tech filter if the unwanted particles just creep in the sides?

Part of the mask is a custom-designed compact nebulizer for anyone who needs medication delivered in mist form, for example someone with asthma. The medicine is delivered automatically according to the dosage and schedule set in the app — which also tracks pollution levels in the area so the user can avoid hot zones.

Finderr is an interesting solution to the problem of visually impaired people being unable to find items they’ve left around their home. By using a custom camera and computer vision algorithm, the service watches the home and tracks the placement of everyday items: keys, bags, groceries, and so on. Just don’t lose your phone, since you’ll need that to find the other stuff.

You call up the app and tell it (by speaking) what you’re looking for; the phone’s camera then determines your location relative to the item, and audio feedback guides you to it in a sort of “getting warmer” style, along with a big visual indicator for those who can see it.
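The guidance loop itself is simple once the vision system has located both the user and the item. A toy version with made-up 2D coordinates, printing cues instead of speaking them:

```python
# Toy "getting warmer" guidance loop. In Finderr, positions would come
# from the ceiling camera and the phone; here they are plain 2D points.

import math

def guide(user_path, item_pos, say=print):
    prev_dist = None
    for pos in user_path:                  # successive user positions
        dist = math.dist(pos, item_pos)
        if dist < 0.3:                     # within arm's reach (meters)
            say("You're there. Reach out.")
            return
        if prev_dist is not None:
            say("Warmer." if dist < prev_dist else "Colder.")
        prev_dist = dist

guide([(5, 5), (4, 4), (4.5, 4.5), (2, 2), (0.1, 0.1)], item_pos=(0, 0))
# Warmer. / Colder. / Warmer. / You're there. Reach out.
```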

After their presentations, I asked the creators a few questions about upcoming challenges, since as is usual in the Imagine Cup, these companies are extremely early stage.

Right now EasyGlucose is working well but Chiang emphasized that the model still needs lots more data and testing across multiple demographics. It’s trained on 15,000 eye images but many more will be necessary to get the kind of data they’ll need to present to the FDA.

Finderr recognizes all the object classes in the widely used ImageNet database, but the team’s Ferdinand Loesch pointed out that new ones can be added easily with as few as 100 training images. As for the up-front cost, the U.K. offers a £500 grant to visually impaired people for this sort of thing, and the team engineered the 360-degree ceiling-mounted camera to minimize the number needed to cover a home.
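Adding a new object class from roughly 100 photos is a textbook transfer-learning job: keep a network pretrained on ImageNet and retrain only a small classification head. Finderr’s actual pipeline is unknown; this is just the standard Keras recipe.

```python
# Sketch of adding a new item class ("my keys") with ~100 photos via
# transfer learning. The class count and training call are placeholders.

import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                     # keep the ImageNet features

num_classes = 1001                         # existing classes + one new item
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Roughly 100 labeled photos of the new item suffice to fit the head:
# model.fit(new_item_images, new_item_labels, epochs=10)
```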

Caeli noted that the nebulizer, which really is a medical device in its own right, could be sold and promoted on its own, perhaps licensed to medical device manufacturers. There are other smart masks coming out, but he had a pretty low opinion of them (not strange in a competitor, though there isn’t some big market leader they need to dethrone). He also pointed out that in India, their initial target market (from which they plan to expand later), it isn’t as difficult to get insurance to cover this kind of thing.

While these are early-stage companies, they aren’t hobbies — though admittedly many of their founders are working on them between classes. I wouldn’t be surprised to hear more about them and others from Imagine Cup pulling in funding and hiring in the next year.
