| hash | doc_id | section | content |
|---|---|---|---|
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.13.5 Existing features partly or fully covering the use case functionality | This feature is currently not documented in the 3GPP specifications.
Concerning the user identity related aspects, the features described in the document TR 22.904 [X] can be applied. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.13.6 Potential New Requirements needed to support the use case | [PR 5.13.6-1] The 5G system shall allow a user to securely manage a digital asset container (e.g. store and update the information associated with this user).
[PR 5.13.6-2] The 5G system shall support mechanisms to retrieve the information of a digital asset container associated with a user by an authorized third party.
[PR 5.13.6-3] When a user accesses an application platform, the 5G system shall support mechanisms to provide the information associated with the user to a third party, according to the service invoked.
[PR 5.13.6-4] The 5G system shall support mechanisms to certify the authenticity of the information of a digital asset container associated with a user.
[PR 5.13.6-5] The 5G system shall protect against spoofing attacks on the user’s digital asset container. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14 Use Case on interconnection of mobile metaverse services | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14.1 Description | The concept "mobile metaverse" and "metaverse" became popular during the coronavirus pandemic as lockdown measures and work-from-home policies pushed more people online for both business and pleasure, increasing demand for ways to make online interaction more lifelike. The term covers a wide variety of location agnostic service experiences, from workplace tools to games and community platforms. It generally refers to shared and immersive digital service experiences (i.e. mobile metaverse services) that people can experience by means of using XR devices. By 2026, 25% of people are estimated to spend at least one hour a day using services that provide immersive XR media for work, shopping, education, social media and/or entertainment, according to the latest study by Gartner, Inc. (a U.S.-based technology research and consulting company).
Mobile metaverse service technologies are still in the early stages of adoption. Many digital environments (i.e. mobile metaverse services that offer location agnostic and location related service experiences) already exist, but they typically run in silos and are not interconnected. From the end users’ viewpoint, there are several basic requirements to be addressed:
- Depending on the immersive XR media service the user wants to connect to, he/she can choose his/her digital representation and the related information when needed: avatar (one or more), e-money (e.g. financial services such as payment, his/her means of payment), ID, purchased items…
- A user is able to transition seamlessly between immersive mobile metaverse services using a similar digital representation, taking into account the constraints of the mobile metaverse services accessed. The transfer of information via the operator's network ensures the semantic compatibility (possibly through abstraction) between the origin and the destination. This also ensures, if necessary, the confidentiality of the origin and the destination.
In the following use case, Alice uses an immersive mobile metaverse service of a travel company, and a trip interests her. She would like to verify if she has enough money to buy the trip. She needs to link the immersive mobile metaverse service of her bank with the current immersive mobile metaverse service of the travel company so that her profile is automatically shared between these two services (she has previously given authorisation).
In this use case 'digital representations' are expected to interwork with specific services. The use case does not propose to define a new 'standard' for digital representations (avatars, electronic money and financial services, IDs, purchased items). Rather, personal information is stored and retrieved to improve service delivery and to assure privacy and security. If some formats, etc. are shared between different service providers, this would present an opportunity for consistency and continuity for the user of those two services, enabled by this use case. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14.2 Pre-conditions | Alice has a service subscription with the network operator M4Mobile, which is for communication services. When visiting an immersive mobile metaverse service, Alice uses a digital representation which contains her avatars and other information such as her electronic money and associated financial services, IDs, purchased items …
Alice has chosen a profile (a subset of her digital representation) for her session to access universes. The choice of information can be automated (without action on the part of the user) depending on the universe visited or already visited, the user configurations, privacy options, etc. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14.3 Service Flows | 1. Alice would like to buy a trip using a mobile metaverse service.
2. Alice connects to the immersive mobile metaverse service A of a travel company with the information she authorises to share for the successful provision of the service (the purchase of the trip).
3. During her session Alice is interested in buying a trip, for which she needs to interact with the mobile metaverse service B of her bank.
4. When moving between these immersive mobile metaverse services A and B, the network operator provides the same user information (for instance regarding the digital representation used to connect to the original mobile metaverse service…) in accordance with the configurations and the rights granted by the user.
5. Information may be coded differently in the immersive mobile metaverse service A and in the immersive mobile metaverse service B (e.g. the level of graphical accuracy of an avatar). In this case, a negotiation by the network operator may be necessary to adapt the information received from A to B. |
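Step 5's adaptation, where the operator reconciles how services A and B encode the same attribute (e.g. the level of graphical accuracy of an avatar), can be sketched as a mapping through an abstract intermediate scale rather than pairwise conversions. This is only an illustrative sketch; the service names, labels and numeric levels are invented and not part of any 3GPP-defined mechanism.

```python
# Each service declares how it encodes avatar fidelity; the operator
# converts via an abstract numeric scale instead of maintaining a
# dedicated mapping for every pair of services.
SERVICE_TO_ABSTRACT = {
    "serviceA": {"lo": 0.2, "mid": 0.5, "hi": 0.9},   # A's own labels
    "serviceB": {"basic": 0.3, "detailed": 0.8},      # B's own labels
}

def adapt(value, src, dst):
    """Map a label of the source service to the destination service's
    label with the closest abstract fidelity level."""
    level = SERVICE_TO_ABSTRACT[src][value]
    return min(SERVICE_TO_ABSTRACT[dst],
               key=lambda k: abs(SERVICE_TO_ABSTRACT[dst][k] - level))

# "hi" in service A (0.9) maps to the closest level in service B
adapt("hi", "serviceA", "serviceB")  # → "detailed"
```

The abstraction layer is what keeps the origin and destination services decoupled: each side only needs to agree with the operator on its own encoding.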
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14.4 Post-conditions | Alice appears in the universe with the digital information chosen in her wallet (with some certified via the network operator). She keeps that information as she travels from universe to universe. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14.5 Existing features partly or fully covering the use case functionality | This feature is currently not documented in the 3GPP specifications.
Concerning the user identity related aspects, the features described in the document TR 22.904 [X] can be applied. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.14.6 Potential New Requirements needed to support the use case | [PR 5.14.6-1] The 5G system shall support suitable APIs to securely provide information of a user to an immersive mobile metaverse service when the user accesses the service.
[PR 5.14.6-2] The 5G system shall support mechanisms to adapt the user assets and information stored by one immersive mobile metaverse service with the information needed, or requested, by another immersive XR media service. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15 Use Case on Access to avatars | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15.1 Description | Mobile metaverse services often involve the use of digital representations (e.g. avatars, which are discussed throughout this use case). Given different use cases, the data associated with the avatars of a user are generated and stored in different mobile metaverse servers. For example, a user uses life-like avatars for e-commerce and cartoonish avatars for gaming. Network operators enabling users to obtain diverse mobile metaverse services should support avatar management. For example, network operators can leverage their existing connections to extensive mobile metaverse servers and provide access to avatars across these servers, acting as a proxy. Compared to the model where two mobile metaverse servers define direct access APIs, the interconnect model described in the use case can utilise the 5G system capability of authenticating and authorizing third-party entities.
The advantage of having a central storage of the information related to avatars is that the same avatar could potentially be used in different mobile metaverse services. The information exposed by the central point, i.e. the 5G system, to different mobile metaverse services helps them share and use the same avatar for a user. Users would therefore benefit from using their UE/mobile access for metaverse services because there are enablers (like this one) that provide consistency between different metaverse services.
It is noted that the storage location of avatars is subject to service agreement between the operator and the third-party entities, and hence it is out of scope of 3GPP. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15.2 Pre-conditions | ClothingA and ClothingB are two small clothing companies that both have virtual stores and provide avatar-based shopping services. Online shoppers can use immersive real-time technology to virtually try on apparel, accessories, or full looks on the digital representations of themselves, i.e. avatars. Their avatars are stored in mobile metaverse servers, and interoperable data formats between these servers are used for avatars.
T is a mobile network operator. Based on its service level agreements with ClothingA and ClothingB, it provides multimedia communication services to enable virtual shopping. Moreover, T behaves like a proxy and supports the exchange of avatars stored in the databases of ClothingA, ClothingB, and any other companies that have agreements with T.
Shaun, an online shopper, has used the virtual try-on service provided by ClothingA several times. His avatar, a 3D visual representation of his actual appearance, is stored in the ClothingA database. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15.3 Service Flows | 1. ClothingA and ClothingB register with the operator T. Shaun registers with T by a UE that has a subscription with T.
2. Shaun visits the ClothingA virtual store using his avatar stored in the ClothingA database. He is authenticated by T and ClothingA, and a multimedia communication session is established between Shaun, a shop assistant, and associated devices (e.g. AR glasses). Shaun tries on some products and sees 3D digital clothing automatically appear on himself.
3. Shaun terminates the session with ClothingA. His user profile on T’s system is updated with the information that an avatar is stored in the ClothingA database. Parameters linked to this avatar in the user profile may include:
- last access time. This could potentially help a user select which avatar to use and help the 5G system determine if the avatar is still available;
- authorised mobile metaverse services;
- address (e.g. IP address). This could potentially help a trusted third party retrieve the avatar.
4. Having been authenticated by T and ClothingB, Shaun visits the ClothingB virtual store for the first time. Since it is his first visit, Shaun has no avatars available in ClothingB. ClothingB requests Shaun’s avatar-related information from the 5G system.
5. The 5G system accepts the request and exposes the selected avatar-related information in Shaun’s user profile to ClothingB. The decision on what information is exposed is subject to user consent and the service agreements between the third parties and T. As pre-agreed by ClothingA, the information related to the avatar stored in its database is exposed to ClothingB. The information is then provided to Shaun, based on which Shaun decides to reuse the avatar stored in the ClothingA database.
6. ClothingB sends a request for Shaun’s avatar to the 5G system. The 5G system authorises the request and provides ClothingB with the IP address of the avatar.
7. ClothingB retrieves the avatar from the ClothingA database using the given IP address. |
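The operator-as-proxy exchange in steps 4-7 could be sketched as below. All class and method names are hypothetical illustrations of the described behaviour, not 3GPP-defined APIs; the record fields mirror the parameters listed in step 3 (last access time, authorised services, address).

```python
from dataclasses import dataclass

@dataclass
class AvatarRecord:
    # Parameters the use case lists for an avatar entry in the user profile
    last_access_time: str
    authorised_services: list   # mobile metaverse services allowed to use it
    address: str                # e.g. IP address of the hosting server

class OperatorProxy:
    """Hypothetical sketch of the 5G-system role: stores avatar metadata
    per user and exposes it only to consented, authorised third parties."""

    def __init__(self):
        self.profiles = {}   # user_id -> list[AvatarRecord]
        self.consent = set() # (user_id, requester) pairs with user consent

    def register_avatar(self, user_id, record):
        self.profiles.setdefault(user_id, []).append(record)

    def grant_consent(self, user_id, requester):
        self.consent.add((user_id, requester))

    def request_avatar_info(self, requester, user_id):
        # Exposure is subject to user consent and service agreement (step 5)
        if (user_id, requester) not in self.consent:
            raise PermissionError("no consent/agreement for this requester")
        return [r for r in self.profiles.get(user_id, [])
                if requester in r.authorised_services]

# Usage mirroring the flow: ClothingB asks the operator for Shaun's info,
# then retrieves the avatar directly from the returned address (step 7).
proxy = OperatorProxy()
proxy.register_avatar("shaun", AvatarRecord("2024-01-01T10:00",
                                            ["ClothingA", "ClothingB"],
                                            "203.0.113.7"))
proxy.grant_consent("shaun", "ClothingB")
records = proxy.request_avatar_info("ClothingB", "shaun")
```

The point of the sketch is that ClothingB never queries ClothingA directly; authorization and discovery go through the operator, and only the final avatar fetch uses the exposed address.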
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15.4 Post-conditions | Shaun tries on products in the ClothingB virtual store with his avatar.
Shaun’s user profile on T’s system is updated to record the use of his avatar in the ClothingB virtual store.
T charges ClothingA, ClothingB, and Shaun for supporting virtual shopping sessions. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15.5 Existing features partly or fully covering the use case functionality | The functional requirements for user identity are captured in TS 22.101 clause 26a [4]. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.15.6 Potential New Requirements needed to support the use case | [PR 5.15.6-1] Subject to user consent, operator policy, and regulatory requirements, the 5G system shall be able to store and update the information related to digital representations for a user (e.g. last access time and address).
[PR 5.15.6-2] Subject to user consent, operator policy, and regulatory requirements, the 5G system shall support mechanisms to expose the information related to the digital representations of a user to a trusted third party.
[PR 5.15.6-3] Subject to user consent and operator policy, the 5G system shall be able to authorise a trusted third party to use the digital representations of a user. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16 Use Case on virtual store in a mobile metaverse marketplace | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16.1 Description | 5G technologies, especially XR communication technologies, make it possible to run a business online (with or without physical offices/stores) to offer various services. Examples include online fashion stores, drop-shipping businesses, virtual real estate agencies, virtual assistants, teaching an online course, and online fitness training. Running a business online is particularly attractive for start-ups and small and even medium-sized businesses. For end consumers, visiting a market is a real feast for the senses. It is a multisensory experience that combines tradition with human contact and involves a series of decisions about which products to choose for the shopping basket.
Mobile metaverse services are expected to help transfer a world so rich in sensations to the virtual sphere, offering rich XR enabled multimedia communication services together with security mechanisms for data protection, user identity/profile management as well as digital asset management and protection. IMS based avatar calls are among these new features/services, where an AI avatar can be used to help facilitate social interactions. An AI avatar [61] [62] is a digital character powered by artificial intelligence, which lives in a virtual setting, like a game, social network or online world. More frequently, avatars are designed as human-like bots that can be controlled by real humans using AI technologies and can easily engage with real humans and maintain relationships with them to varying degrees. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16.2 Pre-conditions | Magnificent Muggles, a niche fashion company, has set up virtual stores in a metaverse marketplace (a mobile metaverse service) provided by GreenMobile. The corresponding 5G communication subscriptions provided by GreenMobile include XR enabled multimedia communication services (including IMS based avatar calls), which enable efficient, near-real-life interaction between the virtual shop assistant and the online shoppers, an immersive location agnostic service experience. As a fashion retailer, Magnificent Muggles expects to provide a similar, if not better, experience to online shoppers in its virtual stores. To enable this, its shop assistants are equipped with XR devices to interact with online buyers. In the virtual stores, end consumers can “see” the products as if they were buying them face-to-face and can interact live with the people selling them.
As part of the service level agreement, GreenMobile provides storage and communication services to Magnificent Muggles:
- to store the digital representations (e.g. avatars) for the virtual shop assistants;
- to assist the authentication of their employees to use the digital representations (e.g. avatars) in the XR communication when assisting online buyers;
- to render the digital representations (e.g. avatars) based on the voice, facial expression or body motion of a human user.
The service flows below illustrate how the virtual shop assistant and an online shopper interact with each other using services provided by 3GPP system. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16.3 Service Flows | 0.1 Magnificent Muggles registers with GreenMobile (the provider of the 5G communication services and the mobile metaverse service) the digital representations (e.g. avatars) to be used by the virtual shop assistants. Subject to regulatory requirements, the digital representations are then certified to be legally used in a certain region. The digital representations are stored at GreenMobile’s edge sites.
0.2 Mrs. Dursley, an end consumer, registers and stores her digital representations with GreenMobile to be used in the metaverse marketplace. Subject to regulatory requirements, the digital representations are then certified to be legally used in a certain region.
1. Humphrey is the shop assistant in the Magnificent Muggles virtual store. Due to the ongoing pandemic he works from home. As part of the security requirements, he needs to be verified as an employee of Magnificent Muggles before having an XR communication with online buyers.
2. Mrs. Dursley decides to pay a visit to the virtual store of Magnificent Muggles to check out the new clothing. Humphrey, the shop assistant, greets her and offers to set up an XR communication to show her around the new lines. Mrs. Dursley thinks this is a good idea and agrees. Having completed the authentication of the participants, the multimedia communication session is set up between Mrs. Dursley and Humphrey as well as the associated XR devices.
3.1 For this session Humphrey uses one of the digital representations registered by Magnificent Muggles. During the session, the terminal sends the audio and video data to the network. The network renders the digital image based on the voice, facial expression and gesture, then sends it to Mrs. Dursley’s terminal.
1) The body motion or facial expressions of Humphrey are captured at UE1 and transmitted to the network.
2) With the received information about the user’s motion or facial expressions, the network renders the avatar (the dynamic 3D object).
3) The media data (converted from the 3D object) is then transmitted to the recipient, UE2 of Mrs. Dursley.
4) The video image (with the rendered avatar) is displayed at the screen of Mrs. Dursley’s terminal.
Figure 5.16.3-1: An example of avatar call functional flow (image rendering at the network)
NOTE: It is also possible for UE1 to send a video stream to the network, with which the network can render the avatar (the dynamic 3D object). This is particularly useful for UEs with limited capability.
3.2 Mrs. Dursley downloads, from the network, one of her registered digital representations to be used for the XR communication. During the session, the rendering is done at the terminal side. An example of the functional flow from Mrs. Dursley to Humphrey is illustrated in figure 5.16.3-2. Note that in this option the required 3D avatar model needs to be made available to all recipients (UE1 in this option).
1) The body motion or facial expressions of Mrs. Dursley are captured at UE2 and transmitted to the recipient via the network.
2) With the received information about Mrs. Dursley’s motion or facial expressions, UE1 renders the avatar (the dynamic 3D object). The video image (with the rendered avatar) is displayed at the screen of Humphrey’s terminal.
Figure 5.16.3-2: An example of avatar call functional flow (image rendering at the receiving side)
NOTE: This use case shows the example of rendering video-based avatar media to a 3D avatar. It is also possible to render other media to a 3D avatar. |
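The two avatar-call options (figure 5.16.3-1, rendering in the network; figure 5.16.3-2, rendering at the receiving UE) differ only in where the 3D avatar model is applied to the captured motion data. A minimal sketch of that split, with invented function names standing in for real capture and rendering pipelines:

```python
def capture_motion(frame):
    """UE side: extract body motion / facial expression parameters from a
    camera frame (stand-in for a real motion-capture pipeline)."""
    return {"pose": frame.get("pose"), "expression": frame.get("expression")}

def render_avatar(avatar_model, motion):
    """Apply motion parameters to the 3D avatar model and produce a video
    frame (stand-in for an actual 3D renderer)."""
    return {"model": avatar_model, **motion}

def network_side_rendering(frame, avatar_model):
    # Figure 5.16.3-1: UE1 uploads only motion data; the network holds the
    # avatar model, renders, and streams ordinary video media to UE2.
    motion = capture_motion(frame)               # at UE1
    video = render_avatar(avatar_model, motion)  # in the network
    return video                                 # transmitted to UE2

def receiver_side_rendering(frame, avatar_model):
    # Figure 5.16.3-2: only motion data crosses the network; the receiving
    # UE must already hold the 3D avatar model and renders locally.
    motion = capture_motion(frame)               # at UE2
    return render_avatar(avatar_model, motion)   # at UE1
```

Both paths produce the same rendered frame; the trade-off is network rendering load versus distributing the avatar model to every recipient, which is why the NOTE highlights network rendering for limited-capability UEs.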
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16.4 Post-conditions | The 3GPP system with a combination of various technologies offers the users an immersive shopping experience equivalent to a face-to-face purchase in a crowded market. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16.5 Existing features partly or fully covering the use case functionality | The service requirements on IMS Multimedia Telephony Service and supplementary services have been documented in TS 22.173 [3] since Rel-7, many of which have been implemented in stage-2 and stage-3 WGs. The requirements on 3GPP IMS Multimedia Telephony Service are captured in TS 22.261 [5] clause 6.39 as a result of Rel-18 work.
On the user identity related aspects, there are several features defined including:
- Support of Multi-device and Multi-Identity in IMS MMTEL service is captured in TS 22.173 clause 4.6 [3]:
The support of multiple devices is inherent in IMS. In addition, a service provider may allow a user to use any public user identities for its outgoing and incoming calls. The added identities can but do not have to belong to the served user. Identities may be part of different subscriptions and different operators.
- TS 22.101 [4] has specified in clause 26a a set of service requirements on User Identity:
Identifying distinguished user identities of the user (provided by some external party or by the operator) in the operator network enables an operator to provide an enhanced user experience and optimized performance as well as to offer services to devices that are not part of a 3GPP network. The user to be identified could be an individual human user, using a UE with a certain subscription, or an application running on or connecting via a UE, or a device (“thing”) behind a gateway UE.
Network settings can be adapted and services offered to users according to their needs, independent of the subscription that is used to establish the connection. By acting as an identity provider, the operator can take additional information from the network into account to provide a higher level of security for the authentication of a user.
The 3GPP System shall support to authenticate a User Identity to a service with a User Identifier.
- Clause 8 of TS 22.261 [5] specifies the security related requirements covering aspects such as authentication and authorization, identity management, and data security and privacy.
The functional requirement and performance KPIs in support of XR applications are mainly captured in TS 22.261 [5]:
- clause 7.6.1 AR/VR;
- clause 6.43 Tactile and multi-modal communication service;
- clause 7.11 KPIs for tactile and multi-modal communication service.
In support of metaverse services, additional considerations need to be given on the following aspects:
- securely register and store the digital representations (e.g. avatars) for the users. The user could be an individual human user using a UE with a certain subscription, or an application running on or connecting via a UE, or a device behind a gateway UE. The user could also be a third party, which is typically an enterprise customer having service level agreement with the operator and interacting with the 3GPP network via an application server.
- assist the authorization of the use of a third party’s digital assets (e.g. digital representations such as avatars) in the XR communication. The third party is also involved in the procedure to certify the user identity (e.g. an employee of the company).
- when required, render the digital representations (e.g. avatars) based on the voice, facial expression or gesture in the live communication video. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.16.6 Potential New Requirements needed to support the use case | [PR 5.16.6-1] Subject to user consent, the 5G system shall support mechanisms to securely register, store and update the digital assets for a user.
NOTE 1: The user could be a human user using a UE with a certain subscription, or an application running on or connecting via a UE, or a device behind a gateway UE. The user could also be a third party, which is typically an enterprise customer having service level agreement with the operator and interacting with the 3GPP network via an application server.
[PR 5.16.6-2] Subject to regulatory requirements and operator’s policy, the 5G system shall provide suitable and secure means to allow a trusted third party to authorize the use of the digital assets (that belong to the third-party enterprise customer) by a user.
NOTE 2: In a typical example the user is an employee of the third-party enterprise customer.
[PR 5.16.6-3] The 5G system shall be able to collect charging information per UE for managing (e.g. register, store and update) the digital assets for an end user (e.g. typically a human user with a certain subscription).
[PR 5.16.6-4] The 5G system shall be able to collect charging information per application for managing (e.g. register, store and update) the digital assets for the third party (e.g. typically an enterprise customer having service level agreement with the operator).
[PR 5.16.6-5] Subject to regulatory requirements and user consent, the 5G system shall support real-time transmission, between a UE and the network, of the body movement information (e.g. body motion or facial expressions) of a human user in order to ensure immersive voice/audio and visual experience.
NOTE 3: The body movement information (e.g. body motion or facial expressions) of a human user is used for rendering of the avatar of this user.
[PR 5.16.6-6] Subject to regulatory requirements, user consent and operator’s policy, the IMS shall support the capabilities of rendering the avatar based on the body movement information (e.g. body motion or facial expression) of a human user. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17 Use Case on Work delegation to autonomous virtual alter ego | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17.1 Description | Artificial Intelligence (AI) is becoming more and more popular in many areas, especially where humans cannot handle complicated tasks well (e.g. factories, vehicles, robots, mobile devices). This trend is likely to continue, and AI will be applied to even more areas. In addition to the rapid expansion of AI, the AI technology itself is also improving. AI that can express emotions like humans and AI that can communicate naturally are now emerging. Given these trends, AI could one day be used not only for industrial use cases, but also as our personal partner and personal assistant to perform many of the tasks around us.
This use case proposes communication with an autonomous virtual alter ego, which is an AI-based digital representation acting autonomously on behalf of a user in the mobile metaverse services. For example, a user's autonomous virtual alter ego autonomously sends mail to clients on the user’s behalf. Also, the alter ego can autonomously communicate with the user, other physical users, and other alter egos by using the network capabilities based on the user’s 3GPP subscription. Therefore, the alter ego's use of the network has to be captured correctly by the network from a charging point of view.
NOTE: The term "autonomous virtual alter ego" means an AI-based digital representation behaving autonomously on behalf of a user herself/himself in the mobile metaverse services.
All the experience and knowledge gained in both the physical world and the metaverse will be shared between the alter ego and its user, thus more than doubling the opportunities to play multiple roles simultaneously. This autonomous virtual alter ego concept aims to improve emotional well-being, health, and life satisfaction by enabling users to perceive many opportunities in life, such as balancing work and family and participating in many communities simultaneously. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17.2 Pre-conditions | John has a UE which has connectivity to 5GS based on subscription to MNO and a contract with an autonomous virtual alter ego service provider. The service setting and parameters for this alter ego service are stored in the user’s subscription data.
John’s virtual alter ego has been trained by the autonomous virtual alter ego service provider, so he can enjoy the alter ego application via his UE.
There are two kinds of application servers. One is for the alter ego application; the other is for the applications to which the alter ego application connects for executing tasks.
The autonomous virtual alter ego service provider is trusted by the MNO, and the alter ego can use the network capabilities autonomously based on the user’s subscription to the MNO.
NOTE: The autonomous virtual alter ego application server does not always have connectivity to the internet (e.g. when the alter ego service is operated on edge servers). Therefore, the alter ego application server may connect to other application servers via the 5GS rather than via the internet.
The MNO offers a service enabler that allows John to request that the network limit the resources his alter ego is able to consume on his behalf. The service enabler also provides John with storage space that he can use to store application-specific data and information about himself in the network. John is able to configure the enabler to give the virtual alter ego limited access to his information. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17.3 Service Flows | 1. John has a face-to-face (F2F) appointment with an important client at the client's office, so he cannot attend an internal web meeting scheduled at the same time. He therefore connects to the alter ego application via his UE and asks his alter ego to show up and participate in the internal web meeting on his behalf.
2. His virtual alter ego checks the network resources on the 5GS and the computing resources on the alter ego server. Then it judges whether the requested task can be processed. The virtual alter ego checks with the enabler server to see what information about John it is able to access.
3. If the alter ego can complete the tasks, the information from the enabler server is used to train the alter ego and the alter ego starts the tasks. Otherwise, the alter ego proposes to John examples of tasks it can perform with the current resources, so that he can reconsider the request. For example, the alter ego can only listen during the meeting and take some notes but cannot say anything. Once John and the alter ego agree on what task(s) the alter ego will perform, John silences or turns off his UE so that he can focus on the F2F meeting.
4. Before the alter ego attends the meeting, it accesses the web meeting server as John’s alter ego via the internet or the 5GS, after receiving permission from John and the web meeting server. When company internal information or some other information is needed for the meeting, the alter ego asks the enabler to permit access to the information or asks John to permit the access.
5. At the meeting, the alter ego explains things and asks questions or makes comments to other attendees (including physical humans and other virtual alter egos).
6. After the meeting, the alter ego autonomously reports to John via the 5GS by message or call. John reads or listens to the report and returns feedback and new requests by message or call, if any.
7. When the alter ego communicates via the 5GS, the charging information is collected and linked to the user’s charging information in order to charge the user subscribing to the alter ego service.
Figure 5.17.3-1: Alter Ego Service Flow |
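The resource feasibility check (steps 2-3) and the charging attribution (step 7) of the flow above could be sketched as follows. The function, class, and task names are illustrative assumptions, not 3GPP-defined interfaces, and the numeric budgets are invented for the example.

```python
def feasible_tasks(requested, network_budget, compute_budget, task_costs):
    """Steps 2-3: return the subset of requested tasks the alter ego can
    run within the current network and compute budgets, so that the user
    can reconsider the request if not everything fits."""
    accepted, net, cpu = [], network_budget, compute_budget
    for task in requested:
        n, c = task_costs[task]
        if n <= net and c <= cpu:
            accepted.append(task)
            net -= n
            cpu -= c
    return accepted

class ChargingCollector:
    """Step 7: network usage by the alter ego is accumulated under the
    human subscriber's charging record, not billed anonymously."""
    def __init__(self):
        self.records = {}
    def record(self, subscriber, actor, volume):
        # Usage by the alter ego ("actor") is linked to the subscriber.
        self.records[subscriber] = self.records.get(subscriber, 0) + volume

# Example mirroring step 3: with these (invented) budgets the alter ego
# can attend the meeting and take notes, but cannot also speak.
costs = {"attend_meeting": (10, 8), "speak": (5, 6), "take_notes": (1, 1)}
ok = feasible_tasks(["attend_meeting", "speak", "take_notes"], 12, 10, costs)
```

The greedy check is only one possible policy; the use case leaves open how the alter ego judges feasibility, requiring only that it can propose a reduced task set when resources are insufficient.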
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17.4 Post-conditions | After receiving the feedback from John, the alter ego is retrained. The autonomous virtual alter ego then becomes more accurate and finishes tasks much more quickly. As a result, the time available to John in life is more than doubled. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17.5 Existing features partly or fully covering the use case functionality | AIML model transfer frameworks documented in TR 23.700-80 [51] can be applied to this use case.
IMS MMTEL services documented in TS 22.173 [3] can be applied to this use case. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.17.6 Potential New Requirements needed to support the use case | [PR 5.17.6-1] The 5G system shall be able to provide a means for a subscriber to authorize a third party to use a subscriber’s digital representation (e.g., avatar) and to access multimedia communication services on behalf of the subscriber.
[PR 5.17.6-2] The 5G system shall be able to collect charging information associated with communication involving a digital representation associated with the subscriber. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18 Use Case on virtual meeting room in financial services | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18.1 Description | Meeting rooms in banks provide a private place for customers and financial managers to communicate; a financial manager can provide customized information on financial products suitable for the customer. Customers may find a dedicated room for consulting and signing contracts safer, and the user experience is better. However, meeting rooms are a limited resource in banks, and customers need to travel to the bank for consulting, which takes more time and resources. A virtual bank meeting room offered by a mobile metaverse service can overcome this limitation.
The virtual banking space can be designed by consumers based on their preferences. Consumers can be represented by their digital representations (e.g. avatars) as they use these mobile metaverse services. Consumers can have eye contact or observe each other's body movements in a virtual environment, generating a friendly face-to-face service experience. With this service option, bank branches are freed from the physical limitations of space and location. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18.2 Pre-conditions | Each user has a unique digital representation (e.g. avatar) in the mobile metaverse service. Bank R offers consumers a virtual bank as a mobile metaverse service, i.e. a location agnostic service experience. This service requires a high level of security in mobile communication as the content is sensitive. Users have their own digital representations (e.g. avatars) that they use to represent themselves when using the mobile metaverse service, and these avatars are mapped to their real identities. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18.3 Service Flows | 1. Frank is a very active user of mobile metaverse service X; he does his daily work and entertainment through this mobile metaverse service using his avatar.
2. Bank R has a virtual branch offered as mobile metaverse service X, in which the bank provides financial services and can offer different financial products to consumers based on their individual preferences. Frank is a VIP customer of Bank R. Frank is considering purchasing some financial products, and he needs to consult a professional financial manager in the virtual bank.
3. Frank enters the virtual bank branch using his avatar. Bank R identifies the user Frank, represented by his digital representation (e.g. avatar), and authorizes it by means of the 5GS, to make sure that the real identity behind this avatar is indeed Frank.
4. The 5GS informs Bank R that this avatar is authenticated and authorized to represent Frank, and that this digital representation (e.g. avatar) is authorized to perform financial actions on Frank's behalf. Bank R receives this information and gives the digital representation (e.g. avatar) representing Frank access to a customized VIP consulting room. In this room, Bank R can provide consulting and financial services to Frank.
5. After the authorization, the 5GS automatically updates the security mechanisms (such as encryption algorithms) associated with the PDU session to guarantee the security of the communication services used to deliver this financial service. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18.4 Post-conditions | Frank had a safe and realistic experience using his digital representation (e.g. avatar) in the virtual meeting room. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18.5 Existing features partly or fully covering the use case functionality | None. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.18.6 Potential New Requirements needed to support the use case | [PR 5.18.6-1] Subject to operator policy and national or regional regulation, the 5G system shall support identification of digital representations (e.g. avatars) associated with users, for mobile metaverse services.
[PR 5.18.6-2] Subject to operator policy and national or regional regulation, the 5G system shall support different communication security mechanisms according to the security requirements of different services. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19 Use Case on Privacy-Aware Dynamic Network Exposure in Immersive Interactive Experiences | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19.1 Description | With the proliferation of APIs in existing mobile applications already creating an extensive market for application exposure, API integration in emerging metaverse applications and features is likely to become a major means of enhancing experiences across extended reality functions, building on existing API development. Given the importance of consistent, reliable network access and of the low-latency connections necessary to generate and maintain immersive metaverse experiences, one can reasonably expect the development of APIs supporting network exposure for configuring and optimizing network features for a diverse array of emerging extended reality interactions. As 5G begins to support VR, AR, and MR interactions through the cellular network, questions surrounding the efficiency and trustworthiness of network exposure to application developers abound.
In particular, the exposure of network characteristics to, and the development of, network-focused applications raises important questions about the privacy of user data, since sensitive data about users' internet usage could reveal personally identifiable information about their location, environment, behaviour, or specific activities. This concern extends beyond industry best practices into emerging requirements from regulations such as the GDPR [52], the CCPA [53], and other national and international privacy regulation frameworks, which specify the right of individuals to privacy across the lifecycle of data that could reveal personally identifiable information in a broad range of contexts. It is thus incumbent on this body to proactively standardize the privacy features of the emerging 5GS in the context of APIs, to ensure that such network exposure in application contexts does not expose providers or users to undue risk or liability. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19.2 Pre-conditions | The following pre-conditions and assumptions apply to this use case:
1. Jenna is developing an application that uses potentially personally identifiable information.
2. Jenna is aware of the existence and relevance of tuneable network characteristics to improve or augment an immersive experience, e.g., sufficient tools exist to modify characteristics like streaming bitrate in immersive contexts.
3. Jenna has access to exposed APIs allowing her to deploy these features in relevant experiences for immersive interaction. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19.3 Service Flows | 1. Jenna develops an application that uses sensitive data, e.g., an application that uses the real-time location and/or environmental features of users’ appearance and surroundings to generate a personal digital representation (e.g. avatar) in a mobile metaverse service activity.
2. Jenna uses an API exposing tuneable network characteristics to carry out some function, e.g., dynamically adjust the streaming resolution of generated mobile metaverse media (e.g. avatar/hologram) or the streaming bitrate of the mobile metaverse media (e.g. avatar) in motion, based on higher-level network characteristics accessible in real time through the API.
3. Jenna develops an application that sends user information through the application to the network provider. Jenna does so in a way that is compliant with existing privacy transmission, storage, and processing standards. This means that Jenna’s application considers relevant privacy-preserving features such as informed consent to process, transmit, store, and appropriately delete any personally identifiable information collected and ingested during the flow.
4. The application uses this information to optimize a network-level feature such as streaming bitrate corresponding to a tuneable knob through the API. The network provider also considers relevant privacy-preserving features ingested as part of the data exchanged during this process.
5. When ingesting potentially personally identifiable information at the network and/or application level, the application provider, user, and network provider receive transparent, verifiable guarantees that data has been processed, stored, and transited in compliance with existing regulations within the user’s jurisdiction. |
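The consent gate implied by steps 3–5 can be sketched as a check performed before any network characteristic derived from user data is exposed. This is a minimal illustration only: the `ConsentRecord` structure, the data-category names, and the error handling are assumptions for the sketch, not 3GPP-defined objects or APIs.

```python
from dataclasses import dataclass, field
from typing import Set

@dataclass
class ConsentRecord:
    """Per-user record of which data categories the user consented to expose
    (fields are illustrative, not normative)."""
    user_id: str
    allowed_categories: Set[str] = field(default_factory=set)

def expose_network_info(consent: ConsentRecord, category: str, value):
    """Return the requested network characteristic only if the user has
    consented to exposure of that data category; otherwise refuse."""
    if category not in consent.allowed_categories:
        raise PermissionError(
            f"user '{consent.user_id}' gave no consent for category '{category}'")
    return value

# Jenna's application may read throughput, but not location, for this user.
alice = ConsentRecord("alice", {"throughput"})
print(expose_network_info(alice, "throughput", "48 Mbit/s"))  # -> 48 Mbit/s
# expose_network_info(alice, "location", ...) would raise PermissionError.
```

The same gate would apply symmetrically on the network side before ingesting user-provided data, per the guarantees in step 5.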
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19.4 Post-conditions | 1. Jenna’s digital representations (e.g. avatars) and other personally identifiable information generated through her application are able to safely exchange information through network exposure APIs without compromising the privacy of users or the network.
2. Network providers remain compliant with existing privacy regulations and best practices. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19.5 Existing feature partly or fully covering use case functionality | Not applicable. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.19.6 Potential New Requirements needed to support the use case | [PR 5.19.6-1] Subject to national/regional regulations, and user consent, the 5G System shall be able to process and expose information from UEs related to user’s location, user’s body, and user’s environment, e.g., user’s home, user’s immediate vicinity.
NOTE: This requirement does not affect the ability of regulatory services, e.g., legal intercept service, to access such information without consent of the user. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20 Use Case on Immersive Tele-Operated Driving in Hazardous Environment | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20.1 Description | Operating vehicles, lifting devices, or machines in an industrial environment is hazardous when achieved manually and locally by a human. Depending on the environment, operators are exposed to dangerous material, toxic fumes, extreme temperatures, landslide risks, radioactivity, etc.
AGVs already exist; moreover, it is expected that human operators can take over control to remotely operate such moving vehicles.
In this use case, it is proposed to leverage 5G to provide an end-to-end system in which a remote user controls a moving device (vehicle, lifting device, robot, etc.) with an immersive cockpit displayed on a virtual reality head-mounted display and haptic gloves for control. Furthermore, the cockpit is complemented with information from the digital twin of the place where the user operates (e.g., sensors in a factory, type of material around, other moving vehicles or persons).
The use case improves user safety and makes the operations even more accurate by merging additional information from a digital twin. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20.2 Pre-conditions | Bob works in a seaport; he operates a lifting device. The place in which he is operating is surrounded by cranes, machines, containers, pipes, and barrels containing hazardous substances.
A new mobile metaverse service is available: instead of locally controlling the device, Bob is installed in a safe remote location from which he is working. The surrounding information is available through a digital twin of the seaport and can come from various sources (IoT sensors, CCTV cameras, connected machines, and other vehicles).
In order to maximize Bob’s efficiency, the metaverse service experience delivered by the system is real-time, with imperceptible latency. This use case includes both location related and location agnostic service experience examples.
The mobile metaverse service Bob uses for teleoperation is running on a mobile metaverse server. In addition, Bob is equipped with a head-mounted display and haptic gloves to remotely control the vehicle. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20.3 Service Flows | 1. This morning, Bob stayed home as his boss informed him about a potential hazard at the factory that was identified through some sensor on a pipe. Unfortunately, the exact nature and location of the hazard on the pipe are not known. So, Bob decides to remotely inspect the factory before his boss and local public authorities arrive to check.
2. He puts on his head-mounted display on which a cockpit environment is displayed from the mobile metaverse server: a virtual control panel appears in front of him. He can see his hands and the control panel in the cockpit. Bob’s application is connected to the mobile metaverse server which enables him to use the service.
3. Bob can tell the mobile metaverse server which surrounding information from the digital twin he wants to monitor. He decides to focus on the 3D representation of the pipe and get real-time sensor information from it, as well as live data from the ambient temperature and gas sensors. The mobile metaverse media displays additional predicted data: the temperature is rising, the gas concentration is increasing, and there is a high risk of explosion in less than 10 minutes if this continues. This surrounding information is integrated with other display elements in the cockpit, but he can anchor it in his FOV.
4. While driving along the seaport by remotely controlling the lifting device via its digital twin in the metaverse server, Bob can also see the (hidden) content of other pipes. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20.4 Post-conditions | Thanks to the 5G mobile metaverse “Tele-operated Driving” service, Bob has been able to drive the vehicle remotely in a reactive way avoiding dangers and finding the leak with the help of the information provided via the digital twins. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20.5 Existing feature partly or fully covering use case functionality | The use case related to traffic flow simulation in clause 5.2 already provides requirements and KPIs related to the operation of a moving UE, similar to an AGV. However, that use case does not envision the use of remote control, e.g., using haptic devices and an HMD, which triggers new requirements.
The use case related to critical healthcare services in clause 5.10 captures the usage of HMDs and haptic devices with related requirements and KPIs, which can be generalized to industrial operations. However, that use case does not consider time-critical decisions based on surrounding moving objects in an open area. Nor does it rely on real-time digital twin updates to track the characteristics of the environment (e.g., information about pipe content). |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.20.6 Potential New Requirements needed to support the use case | [PR 5.20.6-1] The 5G system shall be able to provide a means to associate data flows related to one or multiple UEs with a single digital twin maintained by the mobile metaverse service.
[PR 5.20.6-2] The 5G system shall be able to provide a means to support data flows from one or multiple UEs to update a digital twin maintained by the mobile metaverse service.
[PR 5.20.6-3] Subject to regulatory requirements and operator’s policy, the 5G system shall be able to support data flows directed towards one or multiple UEs as a result of a change in a digital twin maintained by the mobile metaverse service, so that physical objects could be affected via actuators.
NOTE 1: How an application actually operates on physical objects upon receiving a command via the mobile metaverse service, e.g. using actuators, changing environmental controls configuration, etc is out of scope of the 5G system. In addition, regulations and/or other standards could apply to remote operations (e.g. based on a specific industry).
[PR 5.20.6-4] The 5G system shall be able to support the following KPIs for remotely controlling physical objects via the mobile metaverse service.

| Use case | Max allowed end-to-end latency | Service bit rate: user-experienced data rate | Reliability | Area traffic capacity | Message data volume (bits) | Transfer interval | Position accuracy | UE speed | Service area | Remarks |
| Metaverse-based Tele-Operated Driving | [100] ms [25] (NOTE 1) | [10~50 Mbit/s] [25] | 99% [25] | [~360 Mbit/s/km2] (NOTE 4) | ~8 Mbit/s per video stream; four cameras per vehicle (one for each side): 4×8 = 32 Mbit/s. Sensor data (interpreted objects), assuming 1 kB/object/100 ms and 50 objects: 4 Mbit/s [25] | 20~100 ms [25] (NOTE 2) | [10] cm [25] | [10-50] km/h (vehicle); stationary/pedestrian (user) | Up to 10 km radius [25] (NOTE 3) | UL (NOTE 5) |
| (same use case) | [20] ms [25] | [0.1~0.4 Mbit/s] [25] | 99.999% [25] | [~4 Mbit/s/km2] (NOTE 4) | Up to 8 Kb per message [25] | 20 ms [25] (NOTE 2) | [10] cm [25] | [10-50] km/h (vehicle); stationary/pedestrian (user) | Up to 10 km radius [25] (NOTE 3) | DL (NOTE 5) |
| (same use case) | 1-20 ms (NOTE 6) | 16 kbit/s - 2 Mbit/s (without haptic compression encoding); 0.8 - 200 kbit/s (with haptic compression encoding) (NOTE 6) | 99.999% (NOTE 6) | [~20 Mbit/s/km2] (NOTE 4) | 2-8 (1 DoF) (NOTE 6) | - | - | Stationary/pedestrian (user) | Up to 10 km radius [25] (NOTE 3) | Haptic feedback |
NOTE 1: The end-to-end latency refers to the transmission delay between a UE and the mobile metaverse server or vice-versa, not including sensor acquisition or actuator control on the vehicle side, nor processing and rendering on the user side (estimated additional 100 ms in total). The target end-to-end user-experienced maximum delay depends on the reaction time of the remote driver (e.g. at 50 km/h, 20 ms corresponds to about 27 cm of remote vehicle movement).
NOTE 2: UL data transfer interval around 20 ms (video) to 100 ms (sensor); DL data transfer interval (commands) around 20 ms.
NOTE 3: The service area for teleoperation depends on the actual deployment; for example, it can be deployed for a warehouse, a factory, a transportation hub (seaport, airport, etc.), or even a city district or city. In some cases, a local approach (e.g., application servers hosted at the network edge) is preferred to satisfy low latency and high reliability requirements.
NOTE 4: The area traffic capacity is calculated for one 5G network, considering 4 cameras + sensors on each vehicle. Density is estimated at 10 vehicles/km2, each vehicle with one user controlling it. [25]
NOTE 5: Based on [25]. UL is real-time vehicle data (video streaming and/or sensor data); DL is control traffic (commands from the remote driver).
NOTE 6: KPIs come from the “remote control robot” use case in clause 7.11 of [5].
Table-5.20.6-1: Key Performance Indicator (KPI) for mobile metaverse Tele-Operated Driving |
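The derived figures in NOTE 1 and NOTE 4 of Table 5.20.6-1 can be sanity-checked with a few lines of arithmetic. This is an illustrative check only; the per-stream bitrates and vehicle density are the assumptions quoted from [25].

```python
def travel_distance_cm(speed_kmh: float, latency_ms: float) -> float:
    """Distance a vehicle travels during one latency interval."""
    return speed_kmh / 3.6 * (latency_ms / 1000.0) * 100.0  # km/h -> m/s -> cm

# NOTE 1: at 50 km/h, a 20 ms delay corresponds to roughly 27-28 cm of movement.
print(f"{travel_distance_cm(50, 20):.1f} cm")   # -> 27.8 cm

# NOTE 4 (UL row): 4 cameras x 8 Mbit/s video + 4 Mbit/s sensor data per vehicle,
# with an assumed density of 10 vehicles/km2.
per_vehicle_mbps = 4 * 8 + 4                    # 36 Mbit/s per vehicle
print(per_vehicle_mbps * 10, "Mbit/s/km2")      # -> 360 Mbit/s/km2
```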
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21 Use Case on Virtual Emergency Drill over 5G Metaverse | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21.1 Description | An Emergency Drill is a crucial activity for governments, local municipalities, and citizens to prepare for potential disasters such as earthquakes, fires, and floods. To make the drills more effective, it is important for a wide range of people, organizations, and government entities to participate and create simulations that are as close to real-life disaster scenarios as possible. The use of a metaverse environment is expected to significantly enhance the value of these drills. With the ability to provide a more realistic experience, the Emergency Drill in the metaverse is expected to not only improve response to direct damage from emergencies, but also provide valuable data on human thoughts, decisions, and actions in actual crisis situations.
It is also important for mobile operators to anticipate traffic patterns related to confirming people's safety or evacuation actions during an emergency, and to take measures to address potential data traffic congestion, overload, or failure of base stations or network equipment. The mobile network operator should be prepared not only for disasters but also for large-scale network failures, and has to be able to quickly and accurately assess the extent of damage and impact and take timely action to recover the network. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21.2 Pre-conditions | City A, known for its beautiful beaches, attracts many visitors each year. However, it is located near the sea and is at risk of suffering significant tsunami damage in the event of a major earthquake. With the challenges of providing rapid evacuation guidance for residents, saving lives, and restoring infrastructure, City A holds an annual comprehensive emergency drill. Although the drill is typically held on a holiday, the number of participants has been decreasing in recent years due to work, leisure, or COVID-19. This year, City A has decided to conduct the emergency drill in the metaverse environment to address this issue. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21.3 Service Flows | 1. City A is planning a virtual emergency drill that participants can access from any location, such as their office, home, or even the beach.
2. Mobile operator B will provide the 5G system and anticipated operational and maintenance data for the emergency drill in the metaverse environment.
3. In the metaverse environment, a virtual disaster, such as an eruption of Mt. Fuji, is simulated, and participants, including citizens, organizations, and governments, immediately respond by assessing the damage, conducting evacuation and rescue activities, and taking other necessary actions in the virtual space.
4. City A and designated organizations will collect various types of data during the emergency drill.
5. Additionally, Mobile operator B will collect data in the virtual network environment, taking into account actual operational and maintenance data from the real environment, such as UE mobility, overload, and out-of-coverage conditions, to evaluate the impact on the network in the event of a disaster and to implement necessary countermeasures in the virtual environment. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21.4 Post-conditions | By participating in emergency drills, citizens and organizations learn how to respond to disaster scenarios, such as evacuations and rescues, and this information can be incorporated into local government disaster preparedness plans. Additionally, mobile operators can take effective measures to counteract potential network failures and other adverse impacts. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21.5 Existing features partly or fully covering the use case functionality | No existing features are identified. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.21.6 Potential New Requirements needed to support the use case | No potential new requirements have been identified. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22 Use case of Mobile Metaverse Live Concert | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22.1 Description | Mobile metaverse services allow people to enjoy an online digital concert with their avatars, beyond the limitations of time and space. In order to provide an immersive, interactive, location agnostic service experience to mobile metaverse service customers, a large amount of computing resources is needed to perform real-time processing of audio, video, interactive data, etc. Different customers use terminals, e.g. XR glasses, of different brands and with different processing capabilities, and some of the glasses do not have enough computing resources to perform real-time rendering. Through split rendering, most of the computing workload can be offloaded to the network; the high-speed, low-latency transmission provided by the 5G system allows real-time rendering on the edge cloud side, combined with locally optimized rendering on the XR terminal side, to provide an immersive and unbounded XR experience. In addition, just as in the real world, people prefer to watch a concert together with their friends; the mobile metaverse live concert service therefore also provides private boxes where a group of avatars can enjoy the concert privately, and different types of social authority can be provided in the private box on demand. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22.2 Pre-conditions | 1) Alex, Bob and Carey are good friends who live in different cities; they agree to watch the mobile metaverse live concert together.
2) Alex, Bob and Carey are equipped with XR glasses and haptic devices; the equipment can capture their voice, facial expressions, and pose information to generate avatars and to interact with the live concert.
3) Enough computing resources can be provided to the mobile metaverse live concert service and the 5G network is capable of providing sufficiently high throughput and low latency network transmission. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22.3 Service Flows | 1. Alex, Bob and Carey subscribe to the mobile metaverse live concert service and order a “private box” which can only be used by themselves. Alex, Bob and Carey will be represented at the virtual concert event by their avatars.
2. Because image rendering in the interactive live concert requires extensive computing resources, and Alex's and Carey's glasses are not powerful enough to perform this processing, their UEs negotiate with the mobile network operator to offload the rendering service to the edge cloud. Carey's glasses then only receive the rendered images and display them in the field of view. Bob’s glasses are more advanced and can render the images themselves.
3. The concert begins and the three friends access the mobile metaverse service. The live singer is presented to the audience as her own avatar. During the show, the singer and all the audience are represented by their own avatars in the virtual space. Alex, Bob and Carey can adjust their own visual perspective, such as a panoramic view, close range, or even backstage. At the same time, they can hear the singer's voice, see her movements, and immerse themselves in the concert.
4. At the same time, extra VIP services can be provided in the “private box”, e.g. to chat with other audience members in the private box without being overheard. The virtual singer may also enter the "private box" to hold a personal meeting with her selected fans. |
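The offload negotiation in step 2 amounts to a per-device capability check: a terminal renders locally only if its own compute budget meets the scene's demand; otherwise it requests split rendering from the edge cloud. The sketch below is illustrative only; the `gpu_tflops` figures and the scene requirement are assumed values, not 3GPP-defined parameters.

```python
from dataclasses import dataclass

@dataclass
class XrDevice:
    owner: str
    gpu_tflops: float  # local rendering capability (assumed figure)

def rendering_mode(device: XrDevice, scene_tflops: float) -> str:
    """Return 'local' if the glasses can render the scene themselves,
    otherwise 'edge' to request split rendering from the network."""
    return "local" if device.gpu_tflops >= scene_tflops else "edge"

# Step 2: Alex's and Carey's glasses offload rendering; Bob's render locally.
devices = [XrDevice("Alex", 0.5), XrDevice("Bob", 4.0), XrDevice("Carey", 0.3)]
scene_demand = 2.0  # TFLOPS needed for the concert scene (assumed)
print([rendering_mode(d, scene_demand) for d in devices])  # ['edge', 'local', 'edge']
```

In practice the decision would also weigh battery, thermal limits, and the available uplink/downlink capacity, but the same threshold structure applies.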
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22.4 Post-conditions | The consumers in the mobile metaverse live concert service enjoy a great immersive experience and socialize with their friends.
The 5G system is capable of supporting the communication required by the immersive mobile metaverse live concert service. Some extra edge computing services are also provided to some consumers whose equipment has insufficient computing capacity. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22.5 Existing feature partly or fully covering use case functionality | The functional and performance requirements for AR/VR services have been captured in TS 22.261 clause 7.6. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.22.6 Potential New Requirements needed to support the use case | [PR 5.22.6-1] Subject to operator policy, the 5G system shall be able to support avatar-based multiparty communication in mobile metaverse service. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23 Use Case on cooperation between metaverse and network using interactive XR | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23.1 Description | The mobile metaverse allows users to access an endless virtual world at any time and anywhere through their terminals. The mobile metaverse is expected to behave like the real world: in addition to rendering a virtual environment that resembles the physical world, perceived spatial-temporal consistency is key to achieving an immersive location agnostic service experience.
In the mobile metaverse, spatial-temporal consistency for a single user could mean, for example, dropping a virtual pen and seeing the pen subsequently fall. For multiple players, this consistency could mean, for example, that one person cuts down a tree and other people see the tree falling. This user experience requires a motion-to-photon latency in the range of 7 ms to 15 ms [5], at least for a single user viewing the consequences of her own actions. Immersive VR requires the delivery of massive amounts of data (on the order of gigabytes) at ultra-low latency (less than 20 ms) [54].
NOTE: For location agnostic service experience involving multiple users who are not in the same location, the requirements above do not apply, since the service can impose ordering and timing of representations of virtual events in an arbitrary manner.
It should be noted that the computation resources for rendering in the mobile metaverse differ from those of cloud gaming and traditional VR. For example, running a typical massively multiplayer online game today requires multiple teraFLOPS of graphics horsepower, and the demand is expected to grow by two orders of magnitude to create fully immersive mobile metaverse experiences. [55]
For the mobile metaverse, distributed computation is an inevitable processing mode, so the selection of proper servers and data centers should consider the requirements on network delay, processing delay, and storage and computation resources. The goal is to minimize the user's perceived delay.
Therefore, in order to obtain a consistent experience in mobile metaverse services anytime and anywhere, deep collaboration between the mobile metaverse and the 5G network is needed. The potential collaboration aspects may include caching location, computation location, communication path, traffic scheduling, and resource allocation in the network. For example, when a service request arrives, the network control policy needs to coordinate the selection of (i) caching locations to provide digital objects, (ii) computation locations to execute service functions, and (iii) communication paths to route all associated data streams, jointly optimized with dynamic decisions on (iv) traffic scheduling and (v) resource allocation at all network locations. [55] |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23.2 Pre-conditions | About 12,000 players sign up for a popular game and appear simultaneously in a specific setting, as happened in Eve Online in 2021. Due to the processing limitations of existing servers, such high concurrency cannot be supported by a single server, so the network and the application servers need to cooperate to distribute visitors to other servers while ensuring the low latency required by XR applications. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23.3 Service Flows | 1. Bob is a player who attends a popular AR interactive game and gathers with others in a shared environment; the players are aware of each other’s actions, so they need high synchronization.
2. The service provider provides the deployment information of each server to the 5G network, and requests Bob’s physical location and the transmission delay in the 5G network.
3. According to the cooperation agreement with the application, the 5G network exposes information to the service provider, including the physical location and network delay of specific terminals or groups of terminals. The network delay includes the delay inside the 5G system (UE to PSA UPF) and the latency between the PSA UPF and candidate servers.
4. The new server is selected by the service provider according to the UE location, network delay, business requirements, computation resource and storage resource of application servers. The decision result will be sent back to 5G network. Then the 5G network can then formulate corresponding policies for the service flows.
5. The content information will be synchronized to the new server in real time. 5G network should support the ultra-low latency data transmission, potentially among multiple operators. |
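The selection in steps 3 and 4 can be sketched as follows. This is an illustrative sketch only: the data structures, the 20 ms latency bound and the scoring rule are assumptions for this example, not part of any 3GPP-defined API.

```python
# Illustrative server selection based on information exposed by the 5G
# network (UE-side delay) and the service provider's own deployment data.
# All structures and the 20 ms bound are assumptions for this sketch.

def select_server(servers, ue_delay_ms, max_latency_ms=20.0):
    """Keep servers whose end-to-end delay (5GS delay plus UPF-to-server
    delay) meets the bound, then pick the one with the most spare CPU."""
    candidates = [
        s for s in servers
        if ue_delay_ms + s["upf_to_server_ms"] <= max_latency_ms
    ]
    if not candidates:
        return None
    return max(candidates, key=lambda s: s["free_cpu"])

servers = [
    {"name": "edge-A", "upf_to_server_ms": 3.0, "free_cpu": 12},
    {"name": "edge-B", "upf_to_server_ms": 8.0, "free_cpu": 40},
    {"name": "central", "upf_to_server_ms": 25.0, "free_cpu": 90},
]
best = select_server(servers, ue_delay_ms=9.0)  # edge-B: 17 ms, most free CPU
```

A real deployment would also weigh storage resources and business requirements; here the freest CPU simply breaks ties among the latency-feasible servers.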
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23.4 Post-conditions | Bob will have a good experience in this interactive AR game. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23.5 Existing features partly or fully covering the use case functionality | 3GPP has developed edge computing features from R15 to R18. In R15, the AF influence mechanism was introduced to inform the 5G network of application deployment information to assist UPF selection. In R16, the 5G system added a QoS monitoring mechanism for end-to-end delay monitoring of URLLC services. In R17, the 5GS addressed edge DNS server selection and service migration between different edge platforms. In R18, the work focuses on access to edge computing platforms from other operators' networks and the distribution of network policies for a group of local UEs. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.23.6 Potential New Requirements needed to support the use case | No potential new requirements have been identified. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24 Use Case on Authorization of Avatar Usage rights | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24.1 Description | In the metaverse, digital humans (avatars) are widely used in business activities such as advertising, news reporting and live shows. With the maturity of digital human technologies and the continuous growth of market demand, lifelike avatars have become a reality in recent years. In the future, more and more people are expected to use their own avatars to participate in business activities in the virtual world. In particular, celebrities, famous professors and other people with special social positions also have influence in the virtual world. In some scenarios, authorization of avatar usage rights is needed for commercial or other purposes. Without proper management of avatar usage rights, false information could spread and even result in chaos in the virtual world.
Therefore, the 5G system needs to support management and authorization of avatar usage rights. The owner of an avatar is expected to be responsible for the speech and behavior of his/her avatar. An individual or an enterprise has to be authorized by the owner of an avatar before using it, especially in business activities. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24.2 Pre-conditions | Singer J has her own lifelike avatar, which is used for her live concerts in the metaverse. Her touching voice has won hundreds of millions of fans. Company A is a clothing manufacturer. Seeing the commercial value of Singer J, the company invited her to be the company's brand ambassador, who helps to increase brand awareness and attends product promotion activities. Singer J has signed a one-year business contract with Company A. Subject to the contract, Singer J's avatar is the brand ambassador for Company A in the metaverse.
MNO B provides management services (including authorizing and deauthorizing) for the use of avatars in the mobile metaverse services. Each avatar has been assigned a unique identification code in MNO B’s management system. MNO B also provides avatar storage services.
Singer J is one of the subscribers of MNO B, and Company A is also served by MNO B. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24.3 Service Flows | 1. Company A registers with the MNO B as an enterprise customer, while Singer J registers as an individual customer of MNO B. MNO B assigns IDs for Company A and Singer J respectively.
2. Singer J registers her personal avatar with MNO B. The avatar is lifelike and mapped to Singer J's real-world ID. MNO B identifies and stores the avatar and its ID, and associates them with the ID of the avatar's owner, Singer J.
3. Company A sends a request for the usage rights of Singer J’s avatar that is managed by MNO B.
4. MNO B sends a request to Singer J to ask for authorization of the avatar to be used in the mobile metaverse services.
5. After Singer J confirms, Company A's usage rights for the avatar are authorized. MNO B updates the system with the information that Company A has the usage rights for Singer J's avatar. |
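Steps 1 to 5 amount to a registry of avatars and time-bound usage grants. The following sketch is purely illustrative; the class, identifiers and default one-year term are assumptions of this example, not anything defined by 3GPP.

```python
# Hypothetical sketch of MNO B's avatar usage-rights registry. The class,
# identifiers and default one-year term are assumptions of this example.
from datetime import date, timedelta

class AvatarRegistry:
    def __init__(self):
        self.avatars = {}  # avatar_id -> owner_id
        self.grants = {}   # (avatar_id, user_id) -> expiry date

    def register(self, avatar_id, owner_id):
        # Step 2: associate the avatar ID with its owner's ID.
        self.avatars[avatar_id] = owner_id

    def authorize(self, avatar_id, user_id, owner_confirms, start, days=365):
        # Steps 3-5: a grant is recorded only after the owner confirms.
        if not owner_confirms or avatar_id not in self.avatars:
            return False
        self.grants[(avatar_id, user_id)] = start + timedelta(days=days)
        return True

    def may_use(self, avatar_id, user_id, on_date):
        # Time-bound check applied when the avatar is requested later.
        expiry = self.grants.get((avatar_id, user_id))
        return expiry is not None and on_date <= expiry

reg = AvatarRegistry()
reg.register("avatar-J", owner_id="singer-J")
reg.authorize("avatar-J", "company-A", owner_confirms=True,
              start=date(2024, 1, 1))
```

After the grant expires, `may_use` returns False and the registry would refuse to push the avatar, matching the duly terminated usage rights in the post-conditions.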
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24.4 Post-conditions | Upon authorization, Singer J has granted the usage rights of her avatar to Company A for one year to be used in mobile metaverse services. During this year, Company A is authorized to use Singer J’s avatar for business purposes. When Company A wants to use Singer J’s avatar in a mobile metaverse service, the 5GS searches the avatar by its ID, and pushes the requested avatar to company A. At the end of the year, Company A’s usage rights of Singer J’s avatar will be duly terminated. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24.5 Existing features partly or fully covering the use case functionality | None. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.24.6 Potential New Requirements needed to support the use case | [PR 5.24.6-1] Subject to regulatory requirements, user consent and operator’s policy, the 5G system shall support mechanisms to identify an avatar and associate the avatar with a subscriber (i.e. the owner of the avatar).
[PR 5.24.6-2] Subject to regulatory requirements, user consent and operator’s policy, the 5G system shall be able to authorize the avatar to be used in mobile metaverse services.
[PR 5.24.6-3] Subject to regulatory requirements, user consent and operator’s policy, the 5G system shall provide time-bound authorization services for an avatar to be used in mobile metaverse services.
[PR 5.24.6-4] Subject to regulatory requirements, user consent and operator’s policy, the 5G system shall be able to support mechanisms to manage the authorization information about the use of an avatar in mobile metaverse services (e.g. the applied time-bound authorization services, the authorized users).
[PR 5.24.6-5] Subject to regulatory requirements, user consent and operator’s policy, the 5G system shall be able to identify the subscriber who has the right to use an avatar in mobile metaverse services. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25 Use Case on Enabling Metaverse services to users via multiple access connections | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25.1 Description | The metaverse enables immersive virtual media, 3D avatar and holographic communications for realizing use cases such as interactive gaming, virtualized shared workspaces, and immersive conference rooms for remote collaboration, etc. The goal is to create a virtual world we can work in, interact with, and even escape to. Many mobile metaverse use cases are applicable to indoor and/or localized areas such as home, offices, stadiums, shopping malls, movie theatres, theme parks, hospitals, universities, concert halls, etc. Even though metaverse services go beyond virtual reality media presenting virtual worlds that seem to be distant, the scenarios that this use case focusses on are tied to a single physical location which is mostly indoors and serving a localized area. Such physical locations may prefer non-3GPP (trusted, untrusted or wireline) access.
Some mobile metaverse services require high bandwidth and low latencies that can be challenging to meet. Major improvements have been made in non-3GPP access to satisfy these requirements for an uninterrupted, lag-free, immersive mobile metaverse service experience, such as:
- incorporation of 1200 MHz of new spectrum in the 6 GHz band with Wi-Fi 6E, enabling channel sizes of up to 160 MHz
- support for up to 1024-QAM with Wi-Fi 6 and 6E, with Wi-Fi 7 aiming to support up to 4096-QAM
- doubling of the maximum channel bandwidth available to each device to 320 MHz in the 6 GHz band with Wi-Fi 7
- incorporation of High Band Simultaneous (HBS) Multi-Link Operation (MLO) in 802.11be, which aggregates two simultaneous 160 MHz channels (four streams) in the 5 GHz and 6 GHz bands, reducing latency to below 2 ms
In the case of a converged or hybrid network architecture, a single mobile metaverse user can access mobile metaverse services via the 5GS using both 3GPP and non-3GPP accesses simultaneously. In such a scenario, the metaverse traffic would need to be synchronized, as it may be subject to varying latencies when routed over both 3GPP and non-3GPP access. Alternatively, a single mobile metaverse service can be accessed by multiple users across multiple access networks from a network operator. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25.2 Pre-conditions | Two friends and neighbors, Mark and Bob, want to enjoy their weekend using a metaverse application for immersive gaming. Mark is using his home residential broadband (non-3GPP access) while Bob is using the 3GPP access network of the same network operator. Both the 3GPP and non-3GPP access networks are connected to the network operator's 5GC. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25.3 Service Flows | 1. Mark starts the immersive gaming via an authorized 3rd party metaverse application using the network operator's broadband access network (non-3GPP access).
2. Bob joins the immersive gaming to play along with Mark via the same authorized 3rd party mobile metaverse services using the same network operator but connecting to the 3GPP access network.
3. Both Mark and Bob experience different network conditions, e.g., bitrate, reliability, latency across the access networks.
4. 5GS exposes varying network condition changes across the two access networks to an authorized 3rd party mobile metaverse service.
5. Based on the real-time network condition information shared by 5GS, the authorized 3rd party mobile metaverse service adjusts the requested QoS for both users for coordinated user experience.
6. Based on the requested QoS from the authorized 3rd party mobile metaverse service, the 5GS performs dynamic policy updates for the users to meet the desired QoS levels for the metaverse traffic and synchronizes the metaverse application data streams for both users using different access networks. |
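Steps 4 to 6 form a feedback loop between the 5GS and the third-party service. A minimal sketch follows, assuming hypothetical data structures for the exposed network conditions; nothing here is a defined 3GPP interface.

```python
# Illustrative coordination loop: the 3rd-party service reads per-access
# network conditions exposed by the 5GS and requests one common bitrate
# that both players' links can sustain. Names and numbers are assumptions.

def coordinate_qos(conditions, preferred_kbps=50_000):
    """Request the highest bitrate no user's access link would exceed,
    so both players get the same coordinated experience."""
    achievable = min(c["available_kbps"] for c in conditions.values())
    target = min(preferred_kbps, achievable)
    return {user: {"requested_kbps": target} for user in conditions}

conditions = {
    "mark_non3gpp": {"available_kbps": 80_000, "latency_ms": 6},
    "bob_3gpp": {"available_kbps": 35_000, "latency_ms": 12},
}
policy = coordinate_qos(conditions)  # both users capped at 35,000 kbit/s
```

Capping both flows at the weaker link's rate is one simple coordination strategy; the service could equally request asymmetric QoS and rely on buffering for synchronization.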
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25.4 Post-conditions | Both Mark and Bob accessing the same mobile metaverse service across different access networks from the same operator can get the same coordinated user experience even when experiencing different network conditions. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25.5 Existing features partly or fully covering the use case functionality | The 5G system already supports non-3GPP access concurrent with 3GPP access, as well as traffic steering. The traffic steering aspects are covered in TS 24.193.
It is already possible to expose QoS monitoring information to third parties, when using 3GPP access.
Dynamic QoS policy updates are also possible in the 5GS in the HPLMN and VPLMN.
In clause 6.43.2 of 3GPP TS 22.261, there are the following requirements:
The 5G system shall enable an authorized 3rd party to provide policy(ies) for flows associated with an application.
The policy may contain e.g., the set of UEs and data flows, the expected QoS handling and associated triggering events, and other coordination information.
The 5G system shall support means to apply 3rd party provided policy(ies) for flows associated with an application. The policy may contain e.g., the set of UEs and data flows, the expected QoS handling and associated triggering events, and other coordination information.
NOTE: The policy can be used by a 3rd party application for the coordination of the transmission of multiple UEs’ flows (e.g., haptic, audio, and video) of a multi-modal communication session. |
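As a rough illustration of what such a 3rd-party-provided policy might carry, the sketch below mirrors the elements listed in the requirement (set of UEs and data flows, expected QoS handling, triggering events, coordination information); the field names are illustrative, not normative.

```python
# Illustrative container for a 3rd-party-provided application flow policy;
# the field names mirror the requirement text but are not normative.
from dataclasses import dataclass, field

@dataclass
class ApplicationFlowPolicy:
    ues: list                  # the set of UEs the policy applies to
    data_flows: list           # flow descriptors (e.g. haptic, audio, video)
    expected_qos: dict         # expected QoS handling per flow
    triggering_events: list = field(default_factory=list)
    coordination_info: dict = field(default_factory=dict)

policy = ApplicationFlowPolicy(
    ues=["ue-mark", "ue-bob"],
    data_flows=["audio", "video", "haptic"],
    expected_qos={"haptic": {"latency_ms": 5}, "video": {"kbps": 20_000}},
    triggering_events=["congestion"],
)
```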
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.25.6 Potential New Requirements | [PR 5.25.6-1] Subject to operator policy and user consent, 5G system shall be able to provide means to expose network performance information (e.g., bitrate, latency) to an authorized 3rd party metaverse application.
NOTE: The network performance information can be per UE and take into account all available 5G access network types with the aim of improving user experience.
[PR 5.25.6-2] The 5G system shall be able to provide means to enable an authorized 3rd party to synchronize the metaverse traffic that is routed or steered over available 5G access networks. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26 Use Case on IMS-based 3D Avatar Call Support for Accessibility Use Case | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26.1 Description | 3GPP has long standardized functionality to support availability of communication for users with disabilities. Global Text Telephony [56] provides a character-by-character text conversation to enable Global Text for those who rely on it, even for emergency service access. With the advent of speech recognition, it is possible to encode audio calls into text and text can be converted to speech. This kind of conversion goes a long way to achieve ITU-T SG16's Total Conversation vision: "Total Conversation is an ITU-T defined concept that encompasses voice telephony, video telephony and text telephony. The idea is that it gives everyone the chance to communicate with one another regardless of whether they are hearing, hearing impaired or deaf." [57]
There are a number of additional valuable scenarios that could be enabled through the use of IMS 3D Avatar Call, as described in 5.11.
Figure 5.26.1-1: Accessibility Scenarios for IMS 3D Avatar Call
In scenario 1, above, a hearing-impaired user communicates with another using signage. Each user's gestures as well as facial expression and movements are captured by sensors (e.g. these sensors could be part of the terminal equipment) and transformed into an avatar encoding before transmission to the conversational partner. The experience of both parties is natural, and the user experience should resemble that of a video call, albeit with 'idealized lighting and contrast' due to the animation.
In scenario 2, one person speaks while the other signs. The signage of the person on the right is captured as described in scenario 1, but in addition it is analyzed. Research results indicate that it will soon be possible to reliably use AI-based programs to capture signage and generate text from it. [57] Text to speech is clearly possible. Thus the user on the left can see the person on the right signing and receive an audio rendering of the text they generate.
The speech of the user on the left can be converted to text by means of voice recognition. There is extensive research into text to signage as well as some commercial products already available in this area. It is therefore possible for the user on the right to both see the user on the left speaking, as well as an avatar providing signage, or even an avatar rendering of the user on the left performing signage.
In scenario 3, one of the users may not be able to use IMS 3D Avatar call, e.g. they use terminal equipment without this support. In this case, the user on the left enters text and this is rendered as an avatar signing for the user on the right, if this is desired. The user on the right can express herself using signing, which is captured as text (as described for scenario 2) and sent as GTT text media to the user on the left.
One element is currently not possible with text conversion to other media, be it speech or generated avatar media of signage: the timing and emotions expressed in the communication. As part of scenario 3, we consider the possibility of capturing specific text conventions to indicate speech pauses or emotions.
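Such text conventions are not yet defined. As a purely hypothetical illustration, inline markers such as "[pause]" or "[smile]" embedded in the text could be separated from the plain text and turned into rendering cues for the generated speech or avatar media:

```python
# Hypothetical inline markers ("[pause]", "[smile]", "[frown]") separated
# from GTT text into rendering cues; the convention itself is an
# assumption of this sketch, not a defined format.
import re

def extract_cues(text):
    """Return the plain text and the ordered list of timing/emotion cues."""
    cues = re.findall(r"\[(pause|smile|frown)\]", text)
    plain = re.sub(r"\[(?:pause|smile|frown)\]\s*", "", text).strip()
    return plain, cues

plain, cues = extract_cues("I agree [pause] and I'm glad [smile] you came.")
```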
An additional consideration is that the display equipment used to present the IMS 3D Avatar call may be a UE itself, a separate monitor that the UE is able to use, or a display connected to another UE, e.g. via Inter-Device Connectivity (a feature of IMS).
Finally, the possibility to support a communicating user that is 'software generated' is supported well by this use case. In this case, a variant of scenario 2 could be used where the user on the left is in fact an automated customer support centre representative. The computer-generated speech is rendered as signage to the user on the right, and the signage of the user on the right is rendered as speech to the software-based customer service party. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26.2 Pre-conditions | The communicating parties AeCha, Bharathi and Carlos have mobile subscriptions with the PLMNs Absolute Telecom (PLMN A), Benefit Wireless (PLMN B) and Celestial Cellular (PLMN C), respectively.
Both AeCha and Bharathi have UEs that support sensors capable of capturing their facial expressions, movements and gestures sufficiently for this use case. They are also able to set their terminal equipment down (e.g. on a tripod or table) so that they have free hands. Carlos has a UE that is only capable of voice calls. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26.3 Service Flows | Scenario 1: IMS 3D Avatar Call between two callers employing accessibility features and translation
AeCha calls Bharathi. AeCha and Bharathi use IMS 3D Avatar Call to communicate through sign language.
AeCha signs using Korean sign language. Bharathi signs using Indian sign language. There are many forms of sign language in the world that are not mutually comprehensible internationally, and we assume that AeCha and Bharathi would not be able to understand each other's signage directly.
There is a set of services available in the communication channel between AeCha and Bharathi that enables the two of them to communicate.
Figure 5.26.3-1: IMS Avatar Call with Services for Signage to Text and Text Translation
AeCha's signing is captured by the sensors and encoded as Avatar Codec, capturing her use of Korean sign language. In the network, the signage is transcoded into Korean text. The Korean text can be translated into English text. This text can then be used to generate Indian Sign Language (shown in the transcoding function in PLMN B).
It is acknowledged that the translation services included in this use case are not exact; however, the possibility to communicate directly using signage, and even with the avatar of the corresponding party, could be quite valuable.
The avatars seen by AeCha and Bharathi are a representation of the other party, as sufficient information is exchanged by the 5G system to enable the transcoders that produce the avatar codec in PLMN A and PLMN B to do so.
Scenario 2: IMS 3D Avatar Call and Audio between two callers, with accessibility enhancements
Figure 5.26.3-2: IMS Avatar Call with Services for Signage to Text and Text to Voice
Carlos speaks. His speech is recognized (in a transcoder in PLMN C) and encoded as English Text. This text is transported as media. The text is transcoded (in PLMN B) to Indian sign language encoded in an avatar codec. Bharathi views an avatar signing, using Indian sign language to represent Carlos' speech. The avatar is not a representation of Carlos as there are no sensors capturing Carlos, unless there is a means to configure the transcoder in PLMN B with the avatar information corresponding to Carlos' appearance. This is out of scope of this use case.
Bharathi signs, and this is captured in an Avatar codec that identifies her gestures, facial expression and movements. This is converted to English text in a transcoder function in PLMN B. The English text is sent as media to PLMN C, where a transcoder converts the text to speech. This speech is transported as audio media to Carlos, who hears a synthesized voice expressing the communication that Bharathi signed.
Scenario 3: IMS 3D Avatar Call and GTT between two callers, with accessibility enhancements
Figure 5.26.3-3: IMS Avatar Call with Services for Signage to GTT Text and GTT Text to Signage
In this scenario, Carlos uses a GTT terminal to supply GTT media uplink. This media is converted in a transcoder to avatar codec representing signing in Indian sign language to Bharathi. The avatar is not a representation of Carlos as there are no sensors capturing Carlos, unless there is a means to configure the transcoder in PLMN B with the avatar information corresponding to Carlos' appearance. This is out of scope of this use case.
Bharathi signs, and this is captured in an Avatar codec that identifies her gestures, facial expression and movements. This is converted to GTT text in a transcoder function in PLMN B. The GTT media is delivered to Carlos, who reads text expressing the communication that Bharathi signed. |
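The transcoder chains of scenarios 2 and 3 can be sketched as a simple pipeline. The stage functions below merely stand in for real speech recognition and sign-language synthesis components, which are assumptions of this illustration.

```python
# The scenario 2 downlink chain reduced to string transformations; each
# function stands in for a real transcoder, as an illustrative assumption.

def speech_to_text(audio):
    # Transcoder in PLMN C: speech recognition to English text media.
    return audio["spoken_text"]

def text_to_sign_avatar(text):
    # Transcoder in PLMN B: text to Indian sign language avatar codec.
    return {"codec": "avatar", "sign_language": "ISL", "content": text}

def downlink_pipeline(audio):
    # Carlos speaks -> text media -> avatar codec viewed by Bharathi.
    return text_to_sign_avatar(speech_to_text(audio))

media = downlink_pipeline({"spoken_text": "Hello Bharathi"})
```

Scenario 3 replaces the first stage with GTT text input, and the uplink direction simply runs the inverse chain (avatar codec to text, text to speech or GTT).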
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26.4 Post-conditions | In each of three scenarios one or both parties are able to sign and see signage in their native sign language in order to communicate with the other party. The possibility to interwork with legacy GTT terminals and legacy audio terminals is also supported. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26.5 Existing feature partially or fully covering use case functionality | The 5G system supports IMS which is able to handle diverse media, establish calls and support media codec transcoding services.
The 5G system supports GTT. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.26.6 Potential New Requirements | [PR 5.26.6-1] The 5G system shall support the encoding of sensor data capturing the facial expressions, movements and gestures of a person, in a standard form, as part of the avatar encoding.
[PR 5.26.6-2] The 5G system shall support a set of transcoders from and to avatar representations, e.g. between text, speech and avatar encoding.
[PR 5.26.6-3] The 5G system shall support avatar transcoding functionality to control the appearance of the avatar based on the preferences of its associated user. Examples of the controlled appearance could be for the avatar to express behavior, movement, affect, emotions, etc.
[PR 5.26.6-4] The 5G system shall support a set of transcoders from and to GTT to facilitate accessibility of avatar representation and to control the appearance of the encoded avatar.
[PR 5.26.6-5] The 5G system shall be able to collect charging information for transcoding services associated with IMS-based avatar calls. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27 Use Case on Localized Mobile Metaverse Overload | |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27.1 Description | A mobile metaverse offering a location-related service experience may reach its limits, as significant resource-intensive communication is required to support uplink sensor data and downlink media for each user. In a crowded environment, such as an amusement park, users may want to experience augmented reality in their local environment.
In this use case, Dream Park is a huge theme park in a city. This theme park has been in operation for several decades. The attractions (roller coasters, etc.) are no longer 'new' or 'state of the art.' In order to increase interest for visitors without upgrading the attractions, the owners now provide extensive virtual content for each location in the park. This allows customers to enjoy and experience the theme park’s thrilling rides in exciting new ways and to share the park with all sorts of animated characters and decoration.
Figure 5.27.1-1: A theme park that offers localized metaverse services
Visitors can select the type of experience they wish. If they do not buy the premium content they can still enjoy the 'brick and mortar' rides and, conditionally (that is, if there is no congestion), also the premium content. Paid premium users (i.e. users who have purchased tickets to experience special augmented content) can enjoy the premium content at any time, even if there is congestion.
NOTE: This aspect of the use case is not further developed. It is assumed that the support of premium content can be supported in different ways using existing mechanisms.
There is general content that is provided to all visitors, for example, AR public safety messages and announcements. This class of content needs to be delivered very efficiently so it does not produce congestion, but it is not highly interactive or personalized for the specific viewer. This content still perfectly fits the context in which it is displayed, e.g. at the entrance to buildings or along a pathway.
In a major amusement park in 2019, there were an average of 119,000 visitors a day. The park has 2.023 km2 surface area. The resulting user density is 58,824 visitors per km2.
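The density figure follows directly from the quoted numbers:

```python
# Visitor density from the figures quoted above.
visitors_per_day = 119_000
area_km2 = 2.023
density = visitors_per_day / area_km2  # visitors per km2, about 58,824
```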
This use case considers how the 5GS can reasonably provide localized mobile metaverse services (AR that fits the location) even in high user density conditions. We will consider three aspects.
- Support for AR content communicated by mass distribution
Attributions for Figure 5.27.1-1.
The amusement park icon is available given creative commons license from thenounproject.com:
Amusement park - Created by Lars Meiertoberens from Noun Project
AR User, per creative commons.
The amusement park image is available at:
Amusement park image - Parque Salitre - Amusement park - Wikipedia: https://en.wikipedia.org/wiki/Amusement_park#/media/File:Parque_Salitre.JPG as per creative commons license. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27.2 Pre-conditions | Ajay and Vijay have mobile subscriptions with the local operator, Salvo Net, which include the XR multimedia communication service.
Ajay has a premium ticket to the amusement park. Vijay has a normal ticket.
Dream Park offers mobile metaverse services to the park visitors by means of communication services from Salvo Net. They have arranged a specific network slice to suit their localized mobile metaverse services.
In this use case we do not assume that all content is 'all or nothing', that is, that either one buys a premium ticket and gets the content, or one does not get any premium content at all. If there is sufficient capacity in the theme park, anyone can access the premium content. This ensures that the park will fill up every day! The availability of 'premium experiences' after a waiting interval gives an incentive to those who visit on weekdays, when there is bad weather, etc. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27.3 Service Flows | Support for mass distribution of AR content
3.1 The amusement park network slice is 'congested' and there is limited access to premium content. Still, there is in any case 'general content' that has to be delivered to all visitors. This includes public safety announcements, so Dream Park considers it crucial that the delivery of general content to park visitors be supported at all times.
3.2 The amusement park's mobile metaverse service requests exposed functionality of Salvo Net to deliver AR content to all visitors by means of efficient multicast or broadcast transmission, even though the density of visitors is very high (e.g. 60,000 per km2).
3.3 Salvo Net distributes the AR content as requested, efficiently and avoiding further congestion of the amusement park network slice as much as possible. |
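Steps 3.2 and 3.3 boil down to a delivery-mode decision. The sketch below is illustrative only; the threshold value and the per-cell bookkeeping are assumptions of this example.

```python
# Illustrative delivery-mode decision for general AR content; the
# threshold of 50 interested visitors per cell is an assumed value.

def choose_delivery(visitors_in_cell, broadcast_threshold=50):
    """Unicasting identical content to thousands of visitors would congest
    the slice; switch to broadcast above a modest per-cell threshold."""
    if visitors_in_cell >= broadcast_threshold:
        return "broadcast"
    return "unicast"

mode = choose_delivery(visitors_in_cell=600)  # dense park cell -> broadcast
```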
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27.4 Post-conditions | As a result of the support for mass distribution, AR content is delivered to all users in the park efficiently, even though there is very high user density.
Different mobile metaverse services are delivered to the user simultaneously, i.e. it is not the case that only one item of XR content is delivered at a time. It is therefore necessary to ensure that different mobile metaverse servers can synchronize their delivery of content to prevent clashes in the presentation to the user. This is even more important if different mobile metaverse servers produce different components of multi-modal media that have to be delivered to one or more users. |
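The synchronization described above can be sketched as timestamp-based buffering: a presentation instant is released only once every required modality has arrived from its server. The event format and modality names are assumptions of this illustration.

```python
# Timestamp-based buffering across multiple metaverse servers: release a
# presentation instant only when all required modalities have arrived.
# Event format and modality names are assumptions of this sketch.
from collections import defaultdict

def synchronize(events, required=frozenset({"video", "haptic"})):
    buffer = defaultdict(dict)
    released = []
    for ts, modality, payload in events:
        buffer[ts][modality] = payload
        if required <= buffer[ts].keys():
            released.append((ts, dict(buffer[ts])))
    return released

out = synchronize([
    (1, "video", "v1"), (2, "video", "v2"),
    (1, "haptic", "h1"), (2, "haptic", "h2"),
])
```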
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27.5 Existing features partly or fully covering the use case functionality | The 5G system provides extensive support for mobile broadband communication and multicast and broadcast services.
The 5G system provides a means by which resources can be dedicated to multicast and broadcast services, so that these resources are dedicated, and do not diminish when the network is congested.
The 5G system supports network slices to provide services according to the requirements of customers who deliver services to mobile users.
The 5G system supports differentiated QoS policies for different subscribers who are using a particular service. However, the ARP parameter and other mechanisms for responding to congestion in the 5G system cannot be set or otherwise influenced by a third party; there is no way for an AF to request that a specific ARP be applied to a session. |
8fc4e7e237d7663b7a5c6a2b6436bde3 | 22.856 | 5.27.6 Potential New Requirements needed to support the use case | [PR 5.27.6-1] Subject to operator policy, the 5G system shall support mechanisms to expose functionality to a trusted third party to be able to select subscribers to whom mobile metaverse media can be distributed in a resource efficient manner.
[PR 5.27.6-2] Subject to operator policy, subject to user consent, the 5G system shall support efficient mechanisms to provide resource efficient communication of third party mobile metaverse media to one or more subscribers.
[PR 5.27.6-3] Subject to operator policy, the 5G system shall support a mechanism to enable multiple authorized third parties to synchronize media communications from multiple service data flows delivered to one or more UEs.
[PR 5.27.6-4] The 5G system shall be able to collect charging information associated with distribution of third party mobile metaverse media to one or more subscribers.
[PR 5.27.6-5] Subject to operator policy and regulatory requirements, the 5G system shall support a means by which an authorized third-party service provider can request differentiated handling of specific subscribers using the third party's service during network congestion. |