
Following our previous installment, we present the discussion between Associate Professor Yasue Mitsukura of Keio University and Toshitaka Kamiya of Dentsu ScienceJam Inc. (later joined by Kana Nakano for a roundtable). This time, they discussed the origins of their brainwave research, their encounter with the necomimi team, and their future outlook.

Can we understand people's minds from brainwaves?

Kamiya: What initially sparked Professor Mitsukura's research?

Mitsukura: When I started research at university, I wanted to build a Gundam (laughs). I wondered, "What does building a Gundam even mean? How could humans operate something that big?" That led me to think it was about control. I believed control was the future.

Kamiya: So control is essential for building Gundam?

Mitsukura: Yes. After graduating, I joined Tokushima University as a research assistant. My professor at the time was researching image recognition and speech recognition. Speech recognition, while possible now with things like Siri (the voice recognition user interface for iOS), couldn't even recognize basic sounds like "a," "i," "u," "e," and "o" back then. Speech recognition, image recognition, and control all involve waves. You extract features from those waves and create patterns. This is also one of the current approaches to brainwave research.
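The "extract features from waves and create patterns" step can be illustrated with a minimal band-power sketch. This is only my own illustration of the general idea, not the lab's method; the function name, bands, and the synthetic signal are all assumptions.

```python
import numpy as np

def band_power(signal, fs, band):
    """Average spectral power of `signal` within a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return power[mask].mean()

# Synthetic 10 Hz "wave" sampled at 256 Hz for 2 seconds
fs = 256
t = np.arange(0, 2, 1.0 / fs)
wave = np.sin(2 * np.pi * 10 * t)

# Features: power in two bands. The 8-13 Hz band contains the 10 Hz tone,
# so its feature value dominates -- that contrast is the "pattern."
alpha = band_power(wave, fs, (8, 13))
beta = band_power(wave, fs, (13, 30))
```

Comparing such band features across recordings is one common way wave signals of any kind, including EEG, are turned into patterns for classification.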

While researching how to control these waves, my professor posed a question: "Could we understand a person's mind through brainwaves?" He showed me this huge headgear-like EEG machine, and I thought, "Is this professor a little crazy?" (laughs). Still, I got interested, joined the experiment, and spent hours moving beans around. Doing that, wave patterns started to emerge. I'd notice things like, "Hmm, this feels different from yesterday," or "I'm not feeling well today."

As patterns became visible, I started thinking they could be converted into mathematical formulas. That led to even more patterns emerging, and I began to recognize things like, "This pattern means they're concentrating." The moment I realized that observing patterns was the right approach became the turning point in my work on biological signals, including brainwaves.

Kamiya: So no one else had been doing the same kind of work as you before?

Mitsukura: That's right. Everyone was focused on alpha waves and beta waves.

Kamiya: Now you infer emotions from brainwaves, but are there other research areas using signal processing?

Mitsukura: First, there's image processing using signals. For example, there's an avatar system: an imitation system that moves a character on screen via a webcam.

Kamiya: So you mimic the same movements.

Mitsukura: Yes. We're researching how to capture a person via webcam, track eight feature points in the video, and have the character follow that person's movements and expressions. Then there's locating where a sound is coming from, like having a robot move toward the sound, or if it hears "Come here," it moves in that direction.

Kamiya: BCI (Brain Computer Interface) too?

Mitsukura: Yes, we are. Earlier I mentioned brainwave patterns, but blinking also has patterns. Beyond regular blinking, we developed a wheelchair controlled by eye-signal patterns such as winking or closing both eyes. The wheelchair turns in the direction of a wink and goes straight when both eyes are closed. By defining these patterns, we can control the wheelchair.
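The pattern-to-command idea described here can be sketched as a simple lookup from detected eye events to drive commands. This is a hypothetical simplification for illustration; the event names and commands are my own assumptions, not the lab's actual controller.

```python
# Hypothetical mapping from detected eye-signal patterns to drive commands.
COMMANDS = {
    "wink_left": "turn_left",
    "wink_right": "turn_right",
    "both_closed": "forward",
}

def to_command(event: str) -> str:
    """Translate a detected eye-signal pattern into a wheelchair command.

    Any unrecognized pattern (including ordinary blinks) stops the chair,
    which is the safe default for this kind of interface.
    """
    return COMMANDS.get(event, "stop")
```

In a real system the hard part is the upstream classifier that turns raw eye signals into these discrete events; the mapping itself stays this simple by design, so the user can learn it quickly.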

Meeting the "necomimi" Team

Kamiya: After "necomimi," we were looking for algorithms beyond "concentration and relaxation." We learned Professor Mitsukura was researching the five senses, so we barged into her lab to talk. I imagine many others came for the same reason?

Mitsukura: Yes, we'd get five or six requests a week asking to "come see your work." When Kamiya-san and Nakano-san visited, I felt such strong motivation from them. It was this kind of hunger for creating things. When I sensed that drive, I intuitively thought, "Collaborating with them would be truly rewarding." I'd never failed when acting on intuition before, so I instantly felt this would be the same. Normally, joint research takes time to decide, but when I met everyone, I was the one who said, "How about we work together?"

Kamiya: That's right.

Mitsukura: Quite a few companies came by, but none of them had the same impact as when you all arrived.

Nakano: So it was shocking? I didn't know that (laughs).

Kamiya: What kind of impact was it?

Mitsukura: Of course, I knew about "necomimi," but the story—how "next we'll do this, we're also thinking about that, and we want to do this other thing"—was incredibly fresh and ambitious. But you laid out a clear roadmap for it, and I thought, "This could work." It was pure intuition, really.

Kamiya: That it could work.

Mitsukura: "You wear an EEG, and your ears move." We never would have thought of that. "Huh? That part?" (laughs).

Kamiya: It's not something you'd normally think of, right? And he explained it with a completely serious expression.

Mitsukura: When I saw Kamiya-san wearing necomimi, explaining with a deadpan expression how the ears move, or how the tail might move next... I thought, "This is amazing" (laughs).

Nakano: So that was the trigger, huh?

Mitsukura: Nakano-san, just like today, wears an EEG monitor in public and does all sorts of things with it, completely seriously.

Nakano: I came here while doing a little experiment. I didn't want to waste the time walking.

Mitsukura: You were wearing an EEG when you first came to my lab, right?

Nakano: I was experimenting then too, with something else (laughs).

Mitsukura: I found that deeply serious, dedicated attitude incredibly appealing.

The Vision of Science Jam

Kamiya: Professor Mitsukura, what kind of technology are you ultimately aiming for?

Mitsukura: Regardless of the device, the goal is for thoughts to appear as text. For example, turning the brainwaves for "a," "i," "u," "e," "o," "ka," "ki," "ku," "ke," "ko," and so on into patterns and displaying the corresponding characters. It's like a brainwave version of Siri. Beyond that, I envision a system where you communicate by forming frames with your hands, even without a device. Google Glass has that "I'm wearing something" feel, right? That's why I'm considering a contact lens-type head-mounted display.

Nakano: The future you describe sounds like magic now, but I think it'll eventually become as commonplace as using a cell phone. What do you think, Kamiya-san?

Kamiya: I'm not originally from a science background, but by putting out this weird thing called "necomimi," I was able to provide topics of conversation and laughter. At Dentsu ScienceJam Inc., I hope we can continue discovering all kinds of science and make the world more interesting.

Location: Dentsu ScienceJam Inc.


Author

Toshitaka Kamiya

Dentsu ScienceJam Inc.

At Dentsu Communication Design Center (CDC), Next Generation Communication Development Department, launched the neurowear brand and was responsible for producing and developing the brainwave communication tools "necomimi" and "mico." In August 2013, established Dentsu ScienceJam Inc., pursuing new business development by combining the intelligence of scientists and cutting-edge technology with Dentsu's unique ideas.

Yasue Mitsukura

Keio University

Since 1999, she has served as an assistant professor in the Department of Intelligent Information Engineering at Tokushima University's Faculty of Engineering, a full-time lecturer at Okayama University, and an associate professor at Tokyo University of Agriculture and Technology. She is currently an associate professor in the Department of System Design Engineering at Keio University's Faculty of Science and Technology. Since August 2013, she has concurrently served as Chief Technology Officer at Dentsu ScienceJam Inc.

Kana Nakano

Dentsu Inc.

Her specialty lies in leveraging insights gained from researching domestic and international technologies to shape the future use of technology and transformations in communication into tangible experiences. Examples include "iButterfly" (2010), where users catch AR butterflies with coupons via smartphone; "necomimi" (2011), a cat-ear communication tool using brainwaves; "mononome" (2014), an IoT device visualizing the feelings of objects; "Onigilin" (2016), a mindfulness meditation training device; and the "UP-CYCLING POSSIBILITY" project (2023), a future kintsugi technique embedding functionality into broken objects. Hobbies include reading books about living creatures and food, and wandering around cities.
