Combining all the possibilities, up to mixed reality.
Experiencing unexpected combinations leads to mixed reality, spaces of languages and emotions. Mixed reality is a system in which the physical and virtual worlds merge. Predictive and, more generally, computational models are used in the virtual world to enrich experience and knowledge of visible reality. Digital photography has almost completely abandoned the dark-red light of the darkroom and increasingly relies on software, in dialogue with integrated systems that execute various functions. These calculation systems are assembled, according to specific architectures, into hardware structures. Take an ordinary smartphone or PDA of the first twenty years of the twenty-first century: each includes a digital photographic resource, objects that are increasingly interfaceable and able to communicate with one another. It is precisely this transversality and reversibility of the elements that moves the concept of photography further. Controls and stimulations prompt a different approach, both in how we learn and in the movements made to obtain the expected results. The engineer and the designer are increasingly in symbiosis, a little like the photographic concept at its birth.
The purposes are many: visualization, prototyping, commercial development. Verisimilitude to the real world is often required to make a context feel familiar and natural, which implies that lighting and shadow conditions must be considered in the final rendering. Beyond photographic post-production, and up to digital art, architectural studios use image-reconstruction methods, sometimes sensor-based, to obtain real images that can be so photorealistic as to lose any artificial taste. When reconstructing an environment, the starting point is to measure the radiance distribution with optical instruments. Numerical calculation then follows, elaborating algorithms that approximate direct and indirect luminance, together with probabilistic methods. These calculations are so precise and complex that they can even run in real time. It thus becomes possible to create hundreds of frames, later animated in successive dynamics, simulating both visual effects of light and interactions. The production of photorealistic reality is growing in visualization and augmented-reality applications.
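As a minimal illustrative sketch of the first step described above, one can estimate a scene's luminance distribution directly from a captured image. The function names below are hypothetical, and the RGB-to-luminance weights are the standard Rec. 709 coefficients, assumed here for linear RGB values; real reconstruction pipelines use calibrated optical measurements rather than a single photograph.

```python
# Sketch: per-pixel relative luminance from a linear RGB image,
# a crude stand-in for measuring a scene's radiance distribution.
# Rec. 709 coefficients weight the channels by perceived brightness.

def luminance_map(rgb_pixels):
    """Return relative luminance for each (r, g, b) pixel (linear 0..1 values)."""
    return [0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in rgb_pixels]

def mean_luminance(rgb_pixels):
    """Average luminance over the image, a proxy for overall scene brightness."""
    lums = luminance_map(rgb_pixels)
    return sum(lums) / len(lums)

# Example: a pure-white pixel has luminance 1.0, a black pixel 0.0.
print(mean_luminance([(1.0, 1.0, 1.0), (0.0, 0.0, 0.0)]))  # 0.5
```

From such a per-pixel map, the numerical stage can fit approximations of direct and indirect illumination; here only the measurement step is shown.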
© Maria Chiara Fagioli, AR book Wonder Branada, 2015, AR digital photography
Notes: Thanks to augmented reality, you can follow the evolutions in flight of Spartacus, a travelling seagull. Given a photograph, and through a third-party application, the photograph comes alive.
This technological requirement has led to the creation of methods and tools that identify the brightness level of a scene and compose real and virtual objects within it, returning mixed-reality renderings.
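The compositing step mentioned above can be sketched in a few lines. This is an illustrative simplification, not the method of any of the cited systems: the function name and its `scene_brightness` parameter are assumptions, standing in for the far richer lighting estimation that real mixed-reality renderers perform.

```python
# Sketch: blending one virtual RGB pixel over a photographed background pixel.
# The virtual pixel is first scaled by the estimated scene brightness so it
# matches the photograph's lighting, then alpha-blended over the background.

def composite_pixel(background, virtual, alpha, scene_brightness=1.0):
    """Blend a virtual pixel over a background pixel (all values linear 0..1).

    alpha: coverage of the virtual object at this pixel (0 = absent, 1 = opaque).
    scene_brightness: factor matching the virtual object's lighting to the scene.
    """
    # Relight the virtual pixel, clamping to the displayable range.
    lit = tuple(min(1.0, c * scene_brightness) for c in virtual)
    # Standard alpha compositing: alpha * foreground + (1 - alpha) * background.
    return tuple(a * l + (1.0 - a) * b
                 for b, l, a in zip(background, lit, (alpha,) * 3))

# Example: a half-transparent white object over a black background.
print(composite_pixel((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.5))  # (0.5, 0.5, 0.5)
```

Per-pixel loops like this are only didactic; production systems run the same arithmetic on the GPU across the whole frame.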
In this sense, the truthfulness of images can be questioned with respect to their difference from what a human eye sees, that is, where the image coincides with photography, and therefore also with respect to the ethics of production. It is exactly when progress occurs that it becomes difficult to understand whether what we depend on is at our service or against our humanity. For this reason, the boomerang effects of technical frenzy should be foreseen and assimilated within our societies. The exploitation of resources is to be understood not only as coercion of the environment, but also of our own nature: every change affects the balance of the whole world. When a precarious and exploited status quo adopts a model that leads to technological unemployment, while technical improvements could open a shared and widespread permanent prosperity, a dispute arises in the name of new cyclical austerities. At best, humanity could give birth to sedated, specialized beings, unaware of the greatest right of life and overwhelmed by a consumption that will change name and form from time to time. So here we come to the army of Google robots that automatically process and recognize the best photographs in Street View, and to the drones that fly over our future. Precisely at the frontiers of artificial intelligence, and of the question of giving robots a black box, there is a need to understand the limits on the use of robotic means before their industrialization, and the modality of our interaction with them, in the most extreme scenarios of war and pornography. The artefacts of these new realities, in the photographic sector and in the real world generally, will define the ethical boundaries of this new humanity, whether abnegation or a new development of dignity, and its interaction, aesthetic, perceptive, and experiential.
In the beginning was the individual image.
Bull, G. & Bredder, E. (2013) ‘Mixed reality.’ Learning & Leading with Technology. [Online]. 40(5). p. 10. Available from: http://herts.summon.serialssolutions.com.ezproxy.herts.ac.uk/#!/search?bookMark=ePnHCXMw42LgTQStzc4rAe_hSmECzeqamOoCWyYRLODF7sDOj7mpCQesuAS2xw05GXh9MytSUxSADSVQ65OHgaWkqBRYPsq5uYY4e-iCCst4UIgDe__J8cbAbhWwFWACuuiKgAIAIRAl4w . [Accessed: 16 December 2017].
Knecht, M., Traxler, C., Mattausch, O. & Wimmer, M. (2012) ‘Reciprocal shading for mixed reality.’ Computers and Graphics (Pergamon). [Online]. 36(7). pp. 846-856. Available from: http://science.sciencemag.org/content/322/5909/1800.full . [Accessed: 16 December 2017].
Kronander, J., Banterle, F., Gardner, A., Miandji, E. & Unger, J. (2015) ‘Photorealistic rendering of mixed reality scenes.’ Computer Graphics Forum. [Online]. 34(2). pp. 643-665. Available from: http://science.sciencemag.org/content/322/5909/1800.full . [Accessed: 16 December 2017].
Nebbia, G. (2017) Etica e produzione. [Online]. Available from: http://www.minerva.unito.it/Epistemologia&Etica/Articoli1/NebbiaEticaProd.htm . [Accessed: 16 December 2017].
Saracco, R. (2017) This Google robot is a good photographer! [Online]. Available from: http://sites.ieee.org/futuredirections/2017/07/20/this-google-robot-is-a-good-photographer/ . [Accessed: 16 December 2017].
Sharkey, N. (2008) ‘The Ethical Frontiers of Robotics.’ Science. [Online]. 322(5909). pp. 1800-1801. Available from: http://science.sciencemag.org/content/322/5909/1800.full . [Accessed: 16 December 2017].