Live2D is an animation technique used to animate static images, usually anime-style characters, by separating an image into parts and animating each part independently, without the need for frame-by-frame animation or a 3D model.[1][2][3][4][5][6] This enables characters to move with 2.5D motion while preserving the original illustration.
There is a common misconception that Live2D refers to the software used to create such animation.[7][8] Because of the popularity of Live2D Cubism, the technique is often equated with the Live2D Cubism software. However, Live2D is the animation technique itself, and other software can also be used to create Live2D animation, such as Inochi2D or E-mote (used in Tokyo School Life).[9]
Live2D models consist of layered parts saved as a Photoshop (.psd) file. The layers are moved separately to produce the character's overall animation and expressions, such as tilting the head. Parts can be as coarse as face, hair, and body, or as fine as eyebrows, eyelashes, and even effects such as glinting metal.
The number of layers depends on how the character is meant to move and how three-dimensional the result should appear: a simplified model may have around 50 layers, while large, complex projects can reach 750. There is no limit to the level of detail, with some creators even modelling the sides of the teeth for full effect.[10] Unlike a 3D model, there is no traditional skeleton; instead, the flat layers are warped and rotated.
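The idea of animating flat layers by rotation rather than a skeleton can be illustrated with a minimal sketch. This is a conceptual example only, not code from the Live2D Cubism SDK: real Live2D tooling deforms textured meshes per layer, but the underlying 2D transform looks like this.

```python
import math

def rotate_layer(vertices, angle_deg, pivot):
    """Rotate a flat layer's mesh vertices around a pivot point.

    Conceptual illustration of warping/rotating a 2D layer;
    the function name and data layout are assumptions, not
    part of any Live2D API.
    """
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    px, py = pivot
    out = []
    for x, y in vertices:
        dx, dy = x - px, y - py
        out.append((px + dx * cos_a - dy * sin_a,
                    py + dx * sin_a + dy * cos_a))
    return out

# A simple "hair" layer as a quad, tilted 10 degrees around a pivot
# near the top of the head, as when the character tilts its head.
hair_quad = [(0.0, 0.0), (10.0, 0.0), (10.0, 20.0), (0.0, 20.0)]
tilted = rotate_layer(hair_quad, 10.0, pivot=(5.0, 0.0))
```

Because each layer rotates and warps independently, many small transforms like this one combine to give the impression of depth without any 3D geometry.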
Live2D can be combined with real-time motion capture to track head movements and eye movements and to perform lip syncing for real-time applications such as VTubing. The main limitations of the technique are its limited capability for 360° rotation of complex objects and for body tracking.
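In such real-time setups, tracking readings are mapped onto the model's animation parameters, which only cover a limited range of motion. The sketch below illustrates that mapping under stated assumptions: the parameter names ("AngleX", "MouthOpen", "EyeOpen") echo common Cubism-style conventions but are hypothetical here, and real tracking applications expose their own interfaces.

```python
def track_to_parameters(face, param_ranges):
    """Clamp raw face-tracking readings into a model's parameter ranges.

    Hypothetical sketch: parameter names and ranges are assumptions,
    not an actual Live2D or tracking-app API.
    """
    params = {}
    for name, raw in face.items():
        lo, hi = param_ranges[name]
        params[name] = max(lo, min(hi, raw))  # clamp to the model's range
    return params

# A head turned 40 degrees is clamped to the model's +/-30 degree limit,
# reflecting the technique's restricted rotation range.
frame = {"AngleX": 40.0, "MouthOpen": 0.7, "EyeOpen": 1.0}
ranges = {"AngleX": (-30.0, 30.0), "MouthOpen": (0.0, 1.0), "EyeOpen": (0.0, 1.0)}
out = track_to_parameters(frame, ranges)
```

The clamping step is why a Live2D avatar cannot follow a full head turn: beyond the authored parameter range, the flat layers have no artwork to show.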
Live2D was first introduced in 2008 to meet the demand for interactive media.[17] Since then, the technology has also changed how games enhance the user experience through lively characters and expressions.[18]
In 2009, Cybernoids Co., Ltd. (now Live2D Ltd.) released its first Live2D application, Live2D Vector.[19] The application transforms vector graphics so that flat character images achieve three-dimensional head-turning and movement effects. Although such characters can perform only limited movements, they are far more expressive than static pictures or slideshows. Users can also customize a moving character by adjusting parameters in the software or by supplying materials such as images of the character from different angles. Vector graphics nevertheless have limitations: although storage requirements are reduced, rendering complex images consumes considerable CPU and RAM. And while more traditional art styles such as oil painting or gouache can be difficult to work with, VTuber creators have experimented with these styles with success.[20]
The first application of the Live2D technique was HibikiDokei, an alarm clock app released in 2010 by sandwichproject (株式会社レジストプランニング). The app features a girl character named "Hibiki" who talks and moves.[21]
In 2011, the PSP game Ore no Imōto ga Konna ni Kawaii Wake ga Nai Portable, released by Namco Bandai Games, became the first game to use the O.I.U. system derived from Live2D technology; characters move and change position and expression while talking to the player.[22] Characters moved on screen expressively and seamlessly, like an anime, which surprised players and triggered the popularity of Live2D.[23]
Software developer Tetsuya Nakashiro had been independently developing Live2D software and founded the company Cyber Noise (Cybernoids; Japanese: サイバーノイズ) in 2006 with support from the Exploratory IT Human Resources Project of Japan's Information-technology Promotion Agency (IPA). Because the technology was novel and initially saw little uptake, Cyber Noise was commercially unsuccessful.
In 2011, Live2D software received attention after its use in the PSP game Ore no Imōto ga Konna ni Kawaii Wake ga Nai Portable, and it subsequently drew interest as a library for Android and iOS. Following this success, Cyber Noise renamed itself Live2D Ltd. in 2014,[24] unifying the company name with its product name. Sales of Live2D have grown significantly since then. In 2021, 70% of Live2D Cubism Pro users were VTubers or VTuber creators, followed by game/app developers and animation/video creators.[25]
Live2D Euclid (released in April 2017, no longer available from October 16, 2018)
Live2D Cubism
Official marketplace
nizima: nizima is a platform where users can buy and sell illustrations and Live2D data, or commission made-to-order work. Illustrators and Live2D creators can collaborate on a character and share sales on the platform. The platform also provides a Live2D preview so users can view and move a model before purchasing.[26]
nizima LIVE: nizima LIVE is a PC application that lets anyone easily move a Live2D model by recognizing and tracking facial expressions. It is aimed at users ranging from beginners new to tracking apps to experienced Live2D users, and can be used for purposes such as VTuber activities and as avatars for calls and meetings.
^ Fujiwaki, Minamo (26 September 2022). "工数を度外視してでも"差別化"を…『勝利の女神:NIKKE』の飽くなき挑戦". PickUPs! (in Japanese). Retrieved 27 July 2024. 弊社にはLive2Dで培ってきたノウハウがありますが、実は『勝利の女神:NIKKE』ではツールを変えSpineを採用しています。 [Translation: Our company has know-how built up with Live2D, but for Goddess of Victory: Nikke we actually switched tools and adopted Spine.]