Contractual pitfalls: building a virtual e-commerce store

Bird & Bird lawyers Nicola Conway, Shima Abbady and Bridget Chamberlain look at the opportunities for virtual e-stores and provide seven helpful tips for the build-out


Retailers report that customers dwell in a virtual store environment for 73% longer, and that purchases thereby increase by 184%, compared with a physical store offering the same goods (Vogue Business Technology Forum, 2023).

These numbers can be improved further by introducing gamification elements and other immersive experiences: AR try-on functionality (particularly for fashion and cosmetic goods); social and other interactive capabilities (such as chat rooms or events); personalisation of how the user sees the store and its products; and virtual sales assistants that can answer questions or guide customers through the virtual environment.

However, venturing into these technologies will mean that most traditional retailers need to engage third parties to build, develop and deploy the virtual store – and this new contractual relationship can create risk and liability exposure that is critical to bear in mind throughout a virtual expansion. We set out below the benefits of operating a virtual store, followed by recommendations for contracting to build, develop and deploy the same.

Benefits of operating a virtual store

⦁ As demonstrated by the figures above, implementing virtual environments, gamification elements, immersive or interactive experiences, AR try-on tools and AI assistants on e-commerce platforms increases consumer dwell time and sales at the checkout.

Example 1: The introduction of virtual try-ons in the cosmetics sphere has been shown to boost customer purchasing. In a physical store, customers cannot test 10 foundation shades or try 30 lipstick colours in the space of a minute to find their match. With an AR preview, they can.

Example 2: From a fashion perspective, guessing size or fit when shopping online can frustrate consumers, particularly because: (a) different brands (nationally and internationally) use different numbering, sizing and fit rules; and (b) most consumers do not look like the models featured in product images. However, AR and VR now enable customers to upload photos of themselves, or at least their own measurements, to see how a garment or accessory will look on, and in proportion to, their own body.

Brands are incentivised to use the technology described in both examples to reduce returns, which are costly for the brand, unsatisfying for the consumer and damaging to the environment.

⦁ When the matching and selection process is made easier, decision-making is faster and checkouts increase. As a bonus, experimentation with appearance is something consumers like to share on social media, raising brand profiles in the process.

⦁ Additionally, consumer trust in online shopping for high-value and luxury goods improves where retailers provide high-quality 3D images (or videos) of products that customers can rotate and magnify, so that they can inspect, for example, the quality of the stitching on a garment or handbag.

Contractual recommendations 

We know that digital tools and solutions can be used to facilitate and enhance the online shopping experience, but for most retailers this necessitates contracting with a third-party service provider that can carry out the build, development and/or deployment of the technology. 

Our top seven recommendations for contracting in this space, with appropriate safeguards, are as follows:

1. Liability: Keep an eye out for service providers seeking to exclude their liability for matters that are actually within their control. Determining the extent of what is within an AI service provider’s control is difficult because of the ‘black box’ problem: a system can continue to develop after purchase, and such development is often based on the data provided by the customer and others. The service provider should be asked to provide an indemnity to protect the customer against damage caused by the output of an AI system. This is especially important given the often complicated value chain involved.

2. Ownership of rights: Consideration must be given to the ownership of rights in the AI system and its outputs. From the customer’s perspective, the data and other outputs generated by the service should be owned by the customer, but the nature of the service may mean that the customer is restricted in its ability to use that data outside the confines of the AI system itself. There is also a more fundamental question around who owns rights that are effectively created by the AI system itself, without human involvement. In any event, the licence should extend to whatever use cases the customer has in mind (i.e. there should be no relevant restrictions on use and no requirements of, for example, attribution).

3. Confidentiality: It will be in the customer’s interest to have strict confidentiality protections that curtail the service provider’s ability to re-use input data.

4. Personal data: If personal data is likely to form part of the data inputs and outputs, the parties must consider the GDPR (General Data Protection Regulation). The customer will need to understand what specific actions the AI solution takes with the data, as well as the service provider’s own data protection and security practices, to ensure that appropriate protections are in place. The customer may insist on an indemnity in relation to the use of personal data in training the system.

5. Age restrictions: Where consent is the basis for processing and parental permission is required (e.g. for children under 16, or under 13, depending on applicable local law), the usual restrictions apply. More generally, data protection for minors is a hot topic for regulators throughout the UK and EU.

6. Disclaimers: Retailers will want to place adequate contractual obligations and warranties on the service provider to ensure that the end-customer’s experience is as accurate and seamless as possible. Even so, retailers will need to ensure that adequate disclaimers are included in their terms with end-customers so that those end-customers cannot seek recourse if, for example, a lipstick does not look exactly the same through the AR tool as in real life. Retailers will also need to be transparent about the limitations of AI systems; for example, end-customers should be made aware when they are interacting with a chatbot rather than a human.

7. Reputational risk of using AI software: Retailers who have built their brand on and around certain values or ethics should carefully consider how, and by whom, the AI they are looking to use was developed, as failure to do so could lead to reputational harm. End-consumers may be sensitive to the AI service provider having received financial backing from a particular state, military or other organisation that does not reflect what they perceive to be the retailer’s values (even if the use of the AI technology is perfectly legitimate). Retailers should carry out broad pre-contractual due diligence to confirm that development has taken place through a supply chain they are comfortable with. They should also ensure that specific provisions are included in the contract to address potential reputational damage through an ethics-by-design approach and to prevent certain behaviours. This could include:

  • The retailer being able to review the supplier’s AI development governance principles as part of pre-contractual due diligence and throughout the life of the contract;
  • The right to audit and monitor the AI service provider’s AI development processes;
  • Having specific undertakings in the contract that the supplier will comply with certain ethical principles; 
  • Rights to terminate the contract where the AI service provider’s supply chain is found not to align with these principles or where the AI service provider amends its AI development governance principles during the term of the relationship in a way which does not align with the retailer’s core values.

Nicola Conway, Shima Abbady and Bridget Chamberlain are associates with Bird & Bird. Contact Nicola Conway at nicola.conway@twobirds.com or your usual Bird & Bird contact with any questions or issues raised in this article.
