The method may include: receiving, at a user interface generated for the electronic device, an indication to access an augmented reality fitting environment associated with at least one shopping item; receiving a selected image of a location in which to display the augmented reality fitting environment; obtaining user measurement data, shopping item data, and emotions analytics data associated with the augmented reality fitting environment; and generating, based on the obtained user measurement data, a plurality of three-dimensional avatars representing a user accessing the augmented reality fitting environment on the electronic device.

For each avatar, the method may include: generating, using the shopping item data and the user measurement data, a unique three-dimensional representation of the at least one shopping item; generating emotive content based at least in part on the emotions analytics data and on the respective generated representation of the at least one shopping item; and triggering, in the user interface, display of the plurality of three-dimensional avatars in the augmented reality fitting environment and within the selected image of the location, each avatar being depicted with the respective generated emotive content and wearing the respective generated representation of the at least one shopping item.
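The per-avatar flow described above can be sketched in code. The disclosure does not specify any data structures or APIs, so every name below (`Avatar`, `generate_fitting_scene`, the field names, and the string-valued placeholders for 3-D models and animations) is a hypothetical illustration of the pipeline, not the authors' implementation:

```python
from dataclasses import dataclass
from typing import List

# Illustrative sketch only: all class, function, and field names are
# assumptions; the disclosure describes the flow, not the data model.

@dataclass
class Avatar:
    measurements: dict           # user measurement data driving avatar geometry
    item_model: str = ""         # unique 3-D representation of the shopping item
    emotive_content: str = ""    # animation derived from emotions analytics data

def generate_fitting_scene(measurements: dict, item: str,
                           emotions: dict, n_avatars: int = 3) -> List[Avatar]:
    """Build several avatars from the user's measurement data, then fit a
    unique item representation and an emotive animation to each one."""
    avatars = [Avatar(measurements=measurements) for _ in range(n_avatars)]
    for i, avatar in enumerate(avatars):
        # A unique 3-D representation of the item per avatar
        # (e.g. a size or pose variant).
        avatar.item_model = f"{item}-variant-{i}"
        # Emotive content based on the emotions analytics data and on the
        # generated item representation.
        mood = emotions.get("dominant", "neutral")
        avatar.emotive_content = f"{mood}-animation-for-{avatar.item_model}"
    return avatars

# Example: three avatars wearing variants of a jacket, each animated
# according to the user's dominant detected emotion.
scene = generate_fitting_scene({"height_cm": 170}, "jacket",
                               {"dominant": "joy"})
for a in scene:
    print(a.item_model, a.emotive_content)
```

In the disclosed method, the resulting avatars would then be composited into the user-selected location image for display in the augmented reality fitting environment.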
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.
Clement, Manuel Christian and Kumar, Anshuman, "GENERATING DYNAMIC EMOTIVE ANIMATIONS FOR AUGMENTED REALITY", Technical Disclosure Commons (December 13, 2019).