Difficulties in designing a pipeline for concept work

(Polygon Pictures Inc. / Studio Phones)
This document has been revised based on post-seminar discussions. We plan to add more content and make further revisions as future seminars take place.
translated by PPI Translation Team


Before production begins, a process known as pre-production generally takes place. Pre-production is the stage in which the project is designed through planning, story development, concept art, various settings, character design, and color scripts.
Given that all of the design work mentioned above becomes the foundation for production, the asset data and accompanying meta information created during pre-production needs to be managed in the same way as during production itself. As multiple artists will be involved during pre-production, a pipeline system that allows them to work on their tasks smoothly is essential.
However, pre-production differs from production in that many factors are uncertain to begin with, so frequent trial and error is carried out as team members work collaboratively towards establishing the design. Given that time is limited, how many iterations can be completed is a crucial point for the artists. For this reason, a workflow that lets the artists concentrate as much as possible on their tasks becomes necessary; ultimately, this may be one of the factors that affects the quality of the project itself.
The standard video production workflow follows the waterfall model or a similar design approach. A pipeline system cultivated under this production workflow is difficult to apply to the pre-production stage, as the characteristics of the two workflows differ greatly.
Hence, pre-production requires a more agile workflow, and constructing a pipeline system catered specifically to it is imperative.

■Pre-production workflow

A standard pre-production workflow proceeds as follows:

Pre-production begins with staged story development. Through repeated trial and error, the developed story is molded into a scenario or script. As the story develops further, other tasks such as creating concept art, designs, and color scripts can also move forward.
While the work above is in progress, the output is continuously refined through various iterations. At times, this results in entire assets being discarded and redesigned from scratch.
In this manner, trial and error is repeated during pre-production until the packaged data to be delivered to production is finalized. With proper management of the colossal amount of information and scattered data, the work can be done more efficiently. Without it, it becomes unclear which data was selected for the finalized package, and whether that data has already been approved by the director. This confusion ends up negatively impacting the process that follows: production.
When no pipeline or system is established, the management of this data becomes dependent on individuals. In certain cases, the task of data management itself can therefore end up becoming the bottleneck in the workflow.

■Example of a pipeline system for pre-production

Here we will explain what such a pipeline system involves, using the pre-production system “Pinoco” developed in-house at Polygon Pictures as an example.

During pre-production, a high volume of data such as images, videos, audio, documents, and 3D models is created. Pinoco is a data management pipeline system meant exclusively for use during pre-production. It allows users to manage data revisions, as well as the meta and status information of the data, in one central location. The data is then sorted by adding tags. Pinoco is a browser-based system, and was consciously developed to be as intuitive to use as possible, supporting operations such as drag-and-drop.
Furthermore, as Pinoco was designed on the assumption that the data would be updated continuously through agile trial and error, it comes equipped with functions to search the data from various angles.
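Pinoco's internals are not public, so as a rough sketch only, a tag-and-status search "from various angles" might be modeled like this (all names here, such as `Item` and `search`, are hypothetical and not Pinoco's real API):

```python
# Hypothetical sketch of multi-angle search over tagged pre-production data.
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    kind: str                       # "image", "video", "audio", "document", "3d"
    status: str = "wip"             # e.g. "wip", "review", "approved"
    tags: set = field(default_factory=set)

def search(items, *, tags=frozenset(), kind=None, status=None):
    """Filter the data library by any combination of tags, kind, and status."""
    return [i for i in items
            if set(tags) <= i.tags
            and (kind is None or i.kind == kind)
            and (status is None or i.status == status)]

library = [
    Item("hero_concept_v3", "image", "approved", {"character", "hero"}),
    Item("city_colorscript", "image", "wip", {"environment", "color"}),
    Item("hero_turnaround", "3d", "review", {"character", "hero"}),
]
print([i.name for i in search(library, tags={"hero"}, status="approved")])
# -> ['hero_concept_v3']
```

The point of combining several independent filters is that the same library can be cut along whichever axis the current iteration needs, without reorganizing the data itself.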

© Gamon Sakurai, KODANSHA/AJIN Production Committee. All Rights Reserved.

Since the data is updated frequently, simply uploading a file automatically creates a new revision and stacks it on top of the previous ones. It is also possible to add comments when uploading, and to discuss a given revision with others in the form of a chat.
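The upload-driven revision stacking described above can be sketched as follows; this is an illustrative model only (the class and method names are hypothetical), and Pinoco's real behavior may differ in the details:

```python
# Hypothetical sketch: each upload stacks a new revision automatically,
# with optional comments and per-revision chat.
class ManagedData:
    def __init__(self, name):
        self.name = name
        self.revisions = []  # stacked in upload order

    def upload(self, path, comment=""):
        """Simply uploading a file creates and stacks the next revision."""
        rev = {"rev": len(self.revisions) + 1, "path": path,
               "comment": comment, "chat": []}
        self.revisions.append(rev)
        return rev

    def comment_on(self, rev_number, author, message):
        """Chat-style discussion attached to a specific revision."""
        self.revisions[rev_number - 1]["chat"].append((author, message))

art = ManagedData("villain_concept")
art.upload("/tmp/villain_v1.png", comment="first rough pass")
art.upload("/tmp/villain_v2.png", comment="tightened silhouette")
art.comment_on(2, "director", "closer -- push the shoulders wider")
print(art.revisions[-1]["rev"])  # -> 2
```

Because the revision number is derived from the stack itself, the artist never names or numbers versions by hand, which is exactly the manual bookkeeping such a system is meant to remove.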

In Pinoco, the data is sorted through the use of tags, which can be added via drag-and-drop.

Pinoco is also equipped with a function for painting instructions directly over an image (paintovers), which enables users to carry out lightweight reviews and return feedback within the system itself.

Directors can review and approve the created data from within the system as well, and are also given the option to add notes and memos for production.

The data created during pre-production needs to be handed over to off-site staff as well as sent to production. Since the meta and status information stored alongside the data in the database needs to be sent at the same time, Pinoco automatically renders that information and attaches it to the image.
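As a sketch of this hand-off step: Pinoco renders the metadata onto the image itself, but since that rendering depends on an imaging library, the hypothetical example below only composes the annotation text and a JSON sidecar from a database record (the function and field names are illustrative, not Pinoco's):

```python
# Hypothetical sketch: flattening a DB record into the caption text that
# travels with an image at hand-off time, plus a machine-readable sidecar.
import json

def export_annotation(record):
    """Build a human-readable caption and a JSON sidecar from one record."""
    caption = " | ".join(f"{k}: {v}" for k, v in record.items())
    sidecar = json.dumps(record, ensure_ascii=False, sort_keys=True)
    return caption, sidecar

record = {"asset": "hero_concept", "rev": 7, "status": "approved",
          "approved_by": "director"}
caption, sidecar = export_annotation(record)
print(caption)
# -> asset: hero_concept | rev: 7 | status: approved | approved_by: director
```

The value of attaching the rendered information to the image itself is that the meta and status data cannot become separated from the asset once it leaves the database.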

Although the data size of each individual asset is not large during pre-production, the trial and error and idea-driven creation mean that the number of data items tends to be high. By expending minimal time and effort on sorting this data, more time can be allocated to creative tasks.
While an asset management system for the production stage had already been developed at Polygon Pictures, it was not suited to managing data during pre-production. This led to the decision to develop a new system, “Pinoco”, exclusively for pre-production.

■Pipeline and infrastructure challenges in pre-production

As mentioned above, during the pre-production and concept work stage, ideas are laid out and trial and error is carried out repeatedly in order to develop the design of the project.
Given that a diverse group of artists is involved in pre-production, how to correctly share data and information among team members becomes an important point. Due to the nature of the pre-production stage, it is also unlikely that all members of the team will be working in the same studio; since different members work from different locations, the method for sharing data is consequently more complex. In addition, since this data belongs to the planning stage, a high level of confidentiality is required, meaning that adequate care must be taken from a security standpoint as well.
Before the implementation of “Pinoco”, most of the data and information at Polygon Pictures was managed manually. However, there was a limit to how much tracking could be done for data that was being updated every day. In time this led to issues, such as data that the director had not intended to share ending up on the production floor.

Also, since the data created during pre-production is generally meant to be used during production, it needs to be handed over once complete. If the pre-production pipeline system is independent of the production pipeline system, there will most likely be a need to devise a way to link the two. In addition, because the database structure during production is subdivided by asset or shot, the granularity of the data categories differs from that of pre-production. The pre-production data therefore needs to be mapped onto the production database structure, and doing so manually results in highly elevated costs.
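The granularity mismatch can be made concrete with a hypothetical sketch: one coarse pre-production entry often fans out to several production-side asset or shot entries, and anything without a mapping must be resolved by hand (the mapping table and all paths below are invented for illustration):

```python
# Hypothetical sketch: expanding coarse pre-production entries into the
# finer asset/shot structure used by a production database. In practice
# this table would be authored by hand or semi-automatically.
PREPROD_TO_PROD = {
    "hero_design": ["assets/chr_hero/model", "assets/chr_hero/texture"],
    "city_colorscript": ["shots/sc010/lighting", "shots/sc020/lighting"],
}

def map_to_production(preprod_items):
    """Return (mapped pairs, items that still need manual mapping)."""
    mapped, unmapped = [], []
    for item in preprod_items:
        targets = PREPROD_TO_PROD.get(item)
        if targets:
            mapped.extend((item, t) for t in targets)
        else:
            unmapped.append(item)  # must be resolved manually
    return mapped, unmapped

mapped, unmapped = map_to_production(["hero_design", "prop_sketch"])
print(len(mapped), unmapped)  # -> 2 ['prop_sketch']
```

Even this toy version shows why a manual process is costly: every unmapped item is a human decision, and the fan-out from one design to many production entries multiplies the bookkeeping.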

While there are significant merits to making the database a web application, the actual work is done locally by the artist through a DCC tool installed on their machine. If the database is web-based, the artist therefore has to do the additional work of downloading the necessary data from the web system and uploading it again after completing their task. Ideally, the same DCC tool operations could be performed through a web system or server-side instead of locally; this would eliminate the need to download and upload data, making it possible for team members to work and share data more efficiently. For this to be achieved, the functionality of web systems would need to be expanded further. Realistically, however, this is at the moment a highly difficult system to develop, given the technical issues that would need to be resolved.
That said, the challenges mentioned above differ in the case of Derbies, the system introduced by Studio Phones during the seminar. Derbies offers a high level of operability from the command line itself, with data handled through the open-source JupyterHub, so it can function without downloading or uploading data between the server and the artist’s machine. The system issue Studio Phones faces is the opposite of Polygon Pictures’. While Derbies operates in a convenient manner, it also demands a high level of skill from the artist, unlike a DCC tool that anyone can easily use. The type of system most fitting for in-house staff depends largely on the culture of the studio itself, and we believe the differences between the systems of these two companies reflect that reality.

Nonetheless, the challenges that both Polygon Pictures and Studio Phones face stem from the fact that the workflows, resulting data, and management methods vary greatly between pre-production and production. In conclusion, in the pursuit of linking pre-production and production into one unified pipeline system, one can expect demand for technology capable of handling even more complex data flows seamlessly.