Have you ever felt like momentarily sinking into the awesome world of Japanese comic books (manga)? Well, this new innovation might just do the trick for you.

Manga Generator lets you dive into the comic book realm

The Manga Generator is basically a program that combines software and sensors to let a single user be “immersed” in a live, virtual world laid out across several comic book frames, making it look like you are a character inside a comic strip. You can strike whatever pose you like, and the program automatically adjusts each frame on the fly, repositioning speech balloons and other panel elements to match. It is as if you were creating each frame live, acting out a pre-written story with your own body while using the frames themselves as visual reference.
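To make that adjustment step concrete, here is a minimal, hypothetical sketch of one piece of it: keeping a speech balloon near the character’s head but clear of the body, whatever pose the user strikes. All of the names and the simple 2D geometry below are assumptions for illustration; nothing here comes from the Manga Generator’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    head: tuple      # (x, y) of the head in panel coordinates
    body_box: tuple  # (x0, y0, x1, y1) bounding box around the body

def balloon_anchor(pose: Pose, panel_w: float, panel_h: float,
                   margin: float = 20.0) -> tuple:
    """Pick a balloon position near the head, nudged away from the body
    and clamped inside the panel."""
    hx, hy = pose.head
    x0, y0, x1, y1 = pose.body_box
    # Put the balloon on whichever side of the head has more panel space.
    bx = hx + margin if hx < panel_w / 2 else hx - margin
    by = max(margin, hy - margin)  # float the balloon above the head
    # If that point still overlaps the body, push it past the body box.
    if x0 <= bx <= x1 and y0 <= by <= y1:
        bx = x1 + margin if bx > hx else x0 - margin
    # Clamp to the panel so the balloon never spills off the frame.
    return (min(max(bx, margin), panel_w - margin),
            min(max(by, margin), panel_h - margin))

# Each time the sensors report a new pose, the panel would be re-laid-out:
pose = Pose(head=(120.0, 80.0), body_box=(90.0, 70.0, 160.0, 300.0))
print(balloon_anchor(pose, panel_w=400.0, panel_h=320.0))  # -> (140.0, 60.0)
```

Something like this would presumably re-run for every panel, every frame, as the sensors report a new pose, which is what makes the strip feel live rather than stitched together after the fact.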

The concept was introduced at last year’s International Collegiate Virtual Reality Contest (IVRC) 2012, where it won third place, and the idea is still undergoing research and development. Its highlight is that the program edits itself automatically, in real time, in response to the user’s movements.

Several smartphone features and apps already offer similar functionality for creating comic-strip-style images, but they are said to be comparatively limited, at least in how they can be used. What sets this program apart is that it lets you dive into the story of the frames themselves in real time, free to position your entire body within the virtual comic strip however you want.

The developers admit that there is still a lot of room for development, particularly in offering the user a wider variety of editing choices (background layouts, alternative lines, and positional shading, among others). However, they are confident that development is well on its way toward practical application, and they expect the concept to be highly useful for developing new types of interactive e-comic books in the future.

Source: DigInfo (JP)