| hash | doc_id | section | content |
|---|---|---|---|
5eaf1b94c89939407995fc52470817be | 22.890 | 6.2.3 Service Flows | 1. A passenger is standing in front of a smart kiosk to find the platform for his train.
2. The passenger's UE makes a proximity connection with the kiosk, or vice versa, by using 5G Off-network communication.
3. The passenger sends identification information of the passenger (e.g. customer ID) or of the train (e.g. train number).
4. The kiosk shows the path to the train's platform on a 3D or metaverse-enabled map of the smart station and transmits the information to the passenger's UE.
5. The passenger gets a physical ticket or receipt from the kiosk. If the passenger has reduced mobility, he/she asks the kiosk for guidance by the Mobile Intelligent Assistant, a.k.a. the guidance robot of the smart station.
6. The kiosk asks the FRMCS for the locations of the Mobile Intelligent Assistants, arranges the one closest to the passenger, and brings the Mobile Intelligent Assistant to the passenger, for example, in front of the kiosk.
7. The Mobile Intelligent Assistant comes to the passenger and confirms the passenger's identification information by connecting to the passenger's UE via 5G Off-network communication.
8. The passenger gets on the Mobile Intelligent Assistant and moves to the platform. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.2.4 Post-conditions | The passenger gets on the train and enjoys his/her journey.
The Mobile Intelligent Assistant is released from the passenger service and is ready for other service assignments. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.2.5 Existing features partly or fully covering the use case functionality | MCX location, identity and group management cover the related functionality of this use case. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.2.6 Potential New Requirements needed to support the use case | [PR-6.2.6-1] The 5G System shall support proximity connections between the UE, the kiosk and the Mobile Intelligent Assistant.
[PR-6.2.6-2] The 5G System shall be able to provide functions to handle smart station information so that a kiosk can display a 3D or metaverse-enabled smart station map and generate path information. |
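The path-generation function in [PR-6.2.6-2] can be illustrated with a minimal sketch, assuming a simple grid model of the station (the grid, coordinates and function name are illustrative assumptions, not from this TR): the kiosk computes a shortest walkable route from its own position to the platform with a breadth-first search.

```python
from collections import deque

def station_path(grid, start, goal):
    # grid: 0 = walkable, 1 = blocked; start/goal are (row, col) cells.
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk predecessors back to start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            inside = 0 <= nr < rows and 0 <= nc < cols
            if inside and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None   # platform unreachable from the kiosk

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = station_path(grid, (0, 0), (2, 0))   # kiosk at (0, 0), platform at (2, 0)
```

A real deployment would run such a search over the station's 3D map rather than a toy grid, but the breadth-first structure is the same.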
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3 Multiple concurrent mobility services | |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3.1 Description | In the Railway Smart Station, a transportation convenience service for passengers with reduced mobility is feasible, such as a mobility service that takes passengers to their desired destinations.
Figure 6.3.1-1. Example of multiple concurrent mobility services |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3.2 Pre-conditions | 1. There are Mobile Intelligent Assistants available in the smart station, and the Mobile Intelligent Assistants support the 3GPP system.
2. The Mobile Intelligent Assistants are operated under the central control system via 3GPP access.
3. Each Mobile Intelligent Assistant supports a corresponding mobility service.
4. There are at least two mobility services, where each mobility service requires a different location accuracy and is supported by a different Mobile Intelligent Assistant. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3.3 Service Flows | 1. Two different mobility services are initiated by the central control system.
2. Mobile Intelligent Assistant #1 and Mobile Intelligent Assistant #2 move along their predetermined paths. Here, each path is characterized by the representative location of the corresponding zone.
3. Mobile Intelligent Assistant #1 and Mobile Intelligent Assistant #2 move along the representative locations of the blue zone and the red zone, respectively.
4. The two different mobility services are completed by Mobile Intelligent Assistant #1 and Mobile Intelligent Assistant #2, where the completion times can be different. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3.4 Post-conditions | 1. Two different mobility services are supported in the Railway Smart Station. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3.5 Existing features partly or fully covering the use case functionality | [R-5.11-001] The MCX Service shall support obtaining and conveying Location information describing the position of the MCX UE.
[R-5.11-002] The MCX Service should support obtaining and conveying high accuracy Location information describing the position of the MCX UE.
[R-5.11-002a] The MCX Service shall be able to provide a mechanism for obtaining high accuracy Location information by integrating position information from multiple external sources (e.g. magnetometers, orientation sensors, GNSS).
[R-5.11-003] The MCX Service shall provide for the flexibility to convey future formats of Location information.
[R-6.12-002] The MCX Service shall support conveyance of Location information provided by 3GPP location services.
Note: Please refer to TS 22.280 [7]. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.3.6 Potential New Requirements needed to support the use case | [PR-6.3.6-1] The MCX service shall support obtaining and conveying location information describing the position of each MCX UE with a different location accuracy simultaneously.
Note: The above requirement is intended to be included in clauses 5.11 and 6.12 of TS 22.280 [7]. |
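[PR-6.3.6-1] can be pictured with a small sketch (the class and method names are illustrative assumptions, not the TS 22.280 interface): a location service keeps a per-UE accuracy requirement and conveys, within the same reporting cycle, only reports that meet each UE's own requirement.

```python
from dataclasses import dataclass

@dataclass
class LocationReport:
    ue_id: str
    position: tuple     # (x, y) in metres on the station map
    accuracy_m: float   # accuracy achieved for this report

class LocationService:
    def __init__(self):
        self.required_accuracy = {}   # ue_id -> required accuracy in metres

    def subscribe(self, ue_id, accuracy_m):
        self.required_accuracy[ue_id] = accuracy_m

    def convey(self, ue_id, position, measured_accuracy_m):
        # Convey the report only if it meets this UE's required accuracy.
        if measured_accuracy_m > self.required_accuracy[ue_id]:
            return None
        return LocationReport(ue_id, position, measured_accuracy_m)

svc = LocationService()
svc.subscribe("assistant-1", accuracy_m=1.0)   # coarse, zone-level tracking
svc.subscribe("assistant-2", accuracy_m=0.1)   # fine-grained positioning

ok = svc.convey("assistant-1", (10.0, 4.0), 0.8)        # meets the 1.0 m need
rejected = svc.convey("assistant-2", (3.0, 7.5), 0.5)   # too coarse for 0.1 m
```

The point of the sketch is that both subscriptions are served simultaneously, each against its own accuracy, which is exactly what the requirement adds over [R-5.11-001]/[R-5.11-002].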
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4 Operation of platform screen doors | |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4.1 Description | For the safety of a platform, screen doors are needed at the edge of the platform to prevent dangerous situations when a train is approaching the platform and when passengers are getting on and off the train. The screen doors are opened before the train doors open, and are closed after the train doors close. If there is an emergency situation, the designated CCTVs are controlled to aim at the emergency spot and relay video of the spot to the train driver's monitors and the railway station staff's monitors, including their UEs, to assist their actions to handle the situation.
Figure 6.4.2-1 Operation of train and screen doors with CCTVs in a platform |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4.2 Pre-conditions | Some CCTVs are pre-designated to aim at each part of the platform in case of emergency.
The CCTVs, the train driver and the station staff are pre-defined as a group for emergencies.
In the train, a Trainborne System controls the train doors. The Screen Door Controller handles the screen doors on the platform of the station. Between the system and the doors, synchronisation is maintained so that the train doors and the screen doors open and close at the same time.
While the train is approaching the platform of the station, the Trainborne System and the Screen Door Controller check that the train stops in the right place and that the doors are well aligned. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4.3 Service Flows | 1. The CCTVs in the train and on the platform start video-recording each door and display the videos on the Train Driver's monitors. If the Train Driver finds an abnormal status while monitoring the videos from the CCTVs, the Train Driver can open or close all the doors on the platform and the train manually.
2. The Trainborne System notifies the Screen Door Controller of the train door opening.
3. The Screen Door Controller announces the screen door opening to the passengers on the platform via displays attached to each screen door and/or speakers of the train and the platform.
4. The Screen Door Controller opens the screen doors and notifies the Trainborne System.
5. The Trainborne System makes an announcement of the train door opening to the passengers in the train.
6. The Trainborne System opens the train doors.
7. During a pre-defined time, the train door sensors and screen door sensors detect passengers' movement. While passengers are getting on and off the train, they can keep the doors open by pushing an emergency button on the doors. In this case, a notice is sent to the Train Driver and the station staff in the pre-defined group to let them know of this situation, and the designated CCTVs are controlled to aim at the emergency spot using the location information of the emergency button, to assist the Train Driver and the staff in figuring out the situation.
8. When the time is up, the Trainborne System and the Screen Door Controller announce to passengers that the doors are closing. If, while the doors are closing, the door sensors detect passengers or obstacles such as a bag or umbrella in the door area, the doors stop closing and are re-opened automatically by the Trainborne System and the Screen Door Controller, and the Train Driver is notified of the situation by the Trainborne System. The CCTVs are aimed at the location of the doors, video-record the situation and display it on the Train Driver's monitors. The Train Driver can select a CCTV from the list of CCTVs on the platform and control it manually to get closer video.
9. The Trainborne System notifies the Screen Door Controller of the train door closing and closes the train doors.
10. The Screen Door Controller closes the screen doors and notifies the Trainborne System of the completion of the door closing.
11. The Trainborne System notifies the Train Driver that the train is ready to go. |
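The handshake in the service flow above can be sketched as an ordered message exchange (an illustrative model, not normative behaviour; class names, message strings and the simplified step grouping are assumptions): the screen doors open before the train doors and close after them, and an obstacle during closing re-opens both door sets and notifies the Train Driver.

```python
class DoorSet:
    def __init__(self, name):
        self.name = name
        self.state = "closed"

    def open(self):
        self.state = "open"

    def close(self):
        self.state = "closed"

def station_stop(train_doors, screen_doors, obstacle_during_close=False):
    log = []
    # Screen Door Controller announces and opens the screen doors first.
    log.append("screen door opening announced")
    screen_doors.open()
    # Trainborne System then announces and opens the train doors.
    log.append("train door opening announced")
    train_doors.open()
    # Closing is announced; door sensors may detect an obstacle.
    log.append("door closing announced")
    if obstacle_during_close:
        train_doors.open()
        screen_doors.open()
        log.append("obstacle detected: doors re-opened, Train Driver notified")
        return log
    # Doors close and ready-to-go is signalled.
    train_doors.close()
    screen_doors.close()
    log.append("ready-to-go signalled to Train Driver")
    return log

normal = station_stop(DoorSet("train"), DoorSet("screen"))
blocked = station_stop(DoorSet("train"), DoorSet("screen"),
                       obstacle_during_close=True)
```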
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4.4 Post-conditions | All the doors are closed and the train moves to the next station. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4.5 Existing features partly or fully covering the use case functionality | Group management is fully covered by the 5G system and the MCX framework.
Note: Please refer to TS 22.281 V17.0.0.
[R-5.1.3.1.2-001] The MCVideo service shall provide a mechanism for an MCVideo user to remotely control a camera on another MCVideo UE subject to relevant authorization.
[R-5.1.9.2.2-001] The MCVideo service shall provide a mechanism for an authorized MCVideo User to push video to another MCVideo User.
[R-5.1.9.2.2-002] The MCVideo service shall provide a mechanism for an MCVideo administrator to authorize an MCVideo user to push a video to another MCVideo user.
[R-5.1.9.2.2-008] The MCVideo service shall provide a mechanism for an MCVideo User to suspend and to resume receiving an incoming video stream from an MCVideo push.
Note: Please refer to TS 22.280 V18.2.0 [7].
[R-5.1.1-002] The MCX Service shall provide a mechanism by which an MCX UE makes an MCX Service group transmission to any MCX Service Group(s) for which the current MCX User is authorized.
[R-5.1.1-006] The MCX Service shall provide a mechanism for a dispatcher or authorized user to configure which content source shall be able to transmit the content to an MCX Service Group (e.g. video cameras near an incident).
[R-5.21.1.2-004] The MCX Service shall provide a mechanism for an MCX User to request an authorized MCX User (e.g., a dispatcher) to send an MCX Communication (e.g., video or data) to the MCX UE (downlink pull). |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.4.6 Potential New Requirements needed to support the use case | [PR-6.4.6-1] The FRMCS shall be able to provide a mechanism to trigger an emergency alert based on a combination of UE location (e.g. the location of the specific platform door / train door) and application-generated trigger (e.g. train door did not close properly due to a blockage). |
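[PR-6.4.6-1] combines two conditions: an application-generated trigger and a UE location that maps to a specific platform/train door. A minimal sketch, assuming illustrative door coordinates and a hypothetical proximity radius (none of these names or values come from the TR):

```python
DOOR_LOCATIONS = {
    "platform-door-3": (12.0, 0.5),
    "train-door-3": (12.0, 2.0),
}

def emergency_alert(ue_position, app_trigger, radius_m=3.0):
    if not app_trigger:              # no application-generated trigger: no alert
        return None
    ux, uy = ue_position
    for door, (dx, dy) in DOOR_LOCATIONS.items():
        if ((ux - dx) ** 2 + (uy - dy) ** 2) ** 0.5 <= radius_m:
            return {"alert": "door blockage", "location": door}
    return None                      # trigger with no matching door location
```

Either condition alone is not enough: an application trigger far from any door, or a UE near a door without a trigger, produces no alert.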
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5 Automatic monitoring of Railway Smart Station | |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5.1 Description | The monitoring of a railway station is hard work. It has to be carried out 24 hours a day, 7 days a week, through dozens of CCTVs, and a controller cannot check all the CCTVs at once. To assist with CCTV monitoring, an AI system helps the controller. The AI system is a part of the Railway Smart Station services, and it has live streaming video input from the CCTVs in the Railway Smart Station. The AI system inspects the input video streams and finds abnormal situations such as illegal riding, an unattended suspicious object, unauthorized entry, or a passenger falling from the platform. If it detects an abnormal situation, it sends a warning notice to the Station Staff and the Control Office.
Figure 6.5.1-1 Use case of Automatic Monitoring that covers emergency situation |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5.2 Pre-conditions | Some CCTVs are pre-designated to aim at each part of the station in case of emergency.
An AI system is trained to provide automatic monitoring functions for the railway smart station.
Some abnormal cases are pre-defined in the system. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5.3 Service Flows | 1. The CCTVs in the station provide live streaming videos on the situation of the station such as platforms of the station.
2. The AI system analyses the video data from the dozens of CCTVs.
3. An abnormal situation occurs: a passenger has fallen from the platform.
4. The AI system raises an alarm to the controller of the station to let him know of the situation. The system also sends a notification alarm to station staff who are close to the place where the situation occurred.
5. The staff arrive at the accident site, rescue the passenger, and secure the surrounding area.
6. The controller is aware of the situation and contacts the train to prevent it from entering the platform.
7. The AI system records the video, call history, and actions taken in the process of handling the abnormal situation as data for future learning. |
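Step 4 of the flow above — notifying the controller plus the staff closest to the incident — can be sketched as a simple nearest-neighbour selection (staff identifiers, positions and the recipient limit are illustrative assumptions):

```python
import math

def notify(incident_pos, staff_positions, max_notified=2):
    # staff_positions: staff id -> (x, y) position on the station map.
    # The controller is always notified; staff are ranked by distance
    # to the incident and only the closest few are alerted.
    ranked = sorted(staff_positions,
                    key=lambda s: math.dist(incident_pos, staff_positions[s]))
    return ["controller"] + ranked[:max_notified]

recipients = notify((5.0, 1.0),
                    {"staff-A": (4.0, 2.0),
                     "staff-B": (40.0, 10.0),
                     "staff-C": (5.5, 1.0)})
```

In a deployment the staff positions would come from the MCX location service ([R-5.11-009]-style triggered updates) rather than a static dictionary.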
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5.4 Post-conditions | Passengers are rescued, circumstances are cleared up, and trains are allowed to enter the platform.
The data recorded by the AI system is later used in audits of the handling of the case. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5.5 Existing features partly or fully covering the use case functionality | Group management is fully covered by the 5G system and the MCX framework.
Note: Please refer to TS 22.280 V18.2.0 [7].
[R-5.11-009] The MCX Service shall provide a means for an MCX UE to send a Location information update whenever a trigger condition is satisfied (e.g., initial registration, distance travelled, elapsed time, cell change, tracking area change, PLMN change, MCX Service communication initiation).
[R-6.15.4-004] The MCX Service shall provide a mechanism for a Mission Critical Organization to log at least the following metadata per communication: depending on service this may include; start time, date, MCX User ID, functional alias(es), MCX Group ID, Location information of the transmitting Participant, end time or duration, end reason, type of communication (e.g., MCX Service Emergency, regroup, private) and success/failure indication.
Note: Please refer to TS 22.281 V17.0.0.
[R-5.1.3.3.2-001] The MCVideo service shall provide a mechanism for an authorised MCVideo User to remotely start and stop local recording of video.
[R-5.1.3.3.2-002] The MCVideo service shall provide a mechanism for an authorised MCVideo User to remotely set triggers for automatic commencement of video transmission to authorised MCVideo Users; such triggers to include motion detection, time of day, face recognition, licence plate recognition, location and speed. |
5eaf1b94c89939407995fc52470817be | 22.890 | 6.5.6 Potential New Requirements needed to support the use case | No new potential requirements identified. |
5eaf1b94c89939407995fc52470817be | 22.890 | 7 Potential Consolidated Requirements | |
5eaf1b94c89939407995fc52470817be | 22.890 | 7.1 Introduction | The requirements below refer to the "Railway Smart Station Services", which act as an application of FRMCS and as an outer system of 3GPP.
The potential consolidated requirements mainly focus on the 5G network characteristics and on the interfaces between the FRMCS/MCX functions and the Railway Smart Station Services, the latter being the application of FRMCS and an outer system of 3GPP.
Figure 7.3-1 Scope of the Potential Consolidated Requirements |
5eaf1b94c89939407995fc52470817be | 22.890 | 7.2 Functional aspects | Table 7.2-1: Functional Aspects Consolidated Requirements
| CPR # | Consolidated Potential Requirement | Original PR # | Comment |
|---|---|---|---|
| CPR 7.2-1 | The MCX service shall support obtaining and conveying MCX UE location information describing the position of each MCX UE with a different location accuracy simultaneously. | PR 6.3.6-1 | Intended to be included in clauses 5.11 and 6.12 of TS 22.280 |
| CPR 7.2-2 | The FRMCS shall be able to provide a mechanism to trigger an emergency alert based on a combination of UE location (e.g. the location of the specific platform door / train door) and an application-generated trigger (e.g. train door did not close properly due to a blockage). | PR 6.4.6-1 | Intended to be included in a new clause of TS 22.280 |
5eaf1b94c89939407995fc52470817be | 22.890 | 7.3 Performance | Table 7.3-1: KPIs for Railway Smart Station Services
| Scenario (Note 5) | End-to-end latency | Reliability (Note 1) | UE speed | UE relative speed | User experienced data rate | Payload size (Note 2) | Area traffic density | Overall UE density | Service area dimension (Note 3) |
|---|---|---|---|---|---|---|---|---|---|
| Multiple trains' stops at the same platform (Korea, urban railway) | ≤ 10 ms | 99.9999% | ≤ 100 km/h | ≤ 50 km/h | ≤ 1 Mb/s | Small to large | ≤ 1 Mb/s/km | ≤ 5 (100 m) | ≤ 15 km along rail tracks, including bad weather conditions (Note 4) |
NOTE 1: Reliability as defined in TS 22.289, sub-clause 3.1.
NOTE 2: Small: payload ≤ 256 octets; Medium: payload ≤ 512 octets; Large: payload 513-1500 octets.
NOTE 3: Estimates of maximum dimensions.
NOTE 4: Non-Line-of-Sight (NLOS) between UEs shall be supported.
NOTE 5: Off-network traffic characteristics are not addressed in this table since they can be covered by TR 22.990.
Note: This table is intended to be included in clause 6.2 of TS 22.289 [9]. |
5eaf1b94c89939407995fc52470817be | 22.890 | 8 Conclusions and Recommendations | This technical report collects use cases and derives potential requirements related to RAILSS. This TR also clarifies whether the identified requirements are supported by the current 5G system or whether they are new potential requirements. The consolidated potential requirements that are related to KPIs will be considered for inclusion in TS 22.289 [9]. Most of the other requirements may target MCX specifications, such as TS 22.280 [7], TS 22.281, and TS 22.282 [8].
Annex A: Change history
| Date | Meeting | TDoc | CR | Rev | Cat | Subject/Comment | New version |
|---|---|---|---|---|---|---|---|
| 09/2022 | SA#97e | SP-220935 | | | | Raised to v.1.0.0 by MCC, solving missing Figure 6.5.1-1 (taken from S1-222357) | 1.0.0 |
| 09/2022 | SA#97e | - | | | | Raised to v.19.0.0 by MCC following SA one-step approval | 19.0.0 |
3239937b55fdc406849fa93870986364 | 22.916 | 1 Scope | The present document describes use cases and aspects related to efficient communications service and cooperative operation for a group of service robots including:
◦ exposure of information between application layer and communications layer;
◦ support of on-demand high priority communications;
◦ KPIs for large-scale group operation scenarios;
◦ support of scalable and efficient use of communication resources;
◦ requirements related to media applications specific for service robots; and
◦ aspects related to security, privacy and charging
that are relevant to support stable operation of service robots. This document also describes the existing service requirements and potential correlation with other studies. |
3239937b55fdc406849fa93870986364 | 22.916 | 2 References | The following documents contain provisions which, through reference in this text, constitute provisions of the present document.
- References are either specific (identified by date of publication, edition number, version number, etc.) or non‑specific.
- For a specific reference, subsequent revisions do not apply.
- For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including a GSM document), a non-specific reference implicitly refers to the latest version of that document in the same Release as the present document.
[1] 3GPP TR 21.905: "Vocabulary for 3GPP Specifications".
[2] 3GPP TS 22.104: "Service requirements for cyber-physical control applications in vertical domains; Stage 1".
[3] 3GPP TS 22.261: "Service requirements for the 5G system; Stage 1".
[4] 3GPP TS 22.263: "Service requirements for video, imaging and audio for professional applications (VIAPA); Stage 1".
[5] Next G Alliance Report: 6G Applications and Use Cases, May 2022; https://www.nextgalliance.org/wp-content/uploads/dlm_uploads/2022/07/NGA-Perspective-Brochure-V6.pdf
[6] K.-D. Lee, "A Smart Network of Service Robots: Technical Challenges and Design Considerations," IEEE Communications Magazine, pp. 28-34, August 2021.
[7] K.-D. Lee and C. Gray-Preston, "Everyday Living Assisted by 6G Applications and Solutions," IEEE Wireless Communications Magazine, October 2022.
[8] W. Zhuang, Y. Shen, L. Li, C. Gao and D. Dai, "Develop an Adaptive Real-Time Indoor Intrusion Detection System Based on Empirical Analysis of OFDM Subcarriers," Sensors (Basel, Switzerland), vol. 21, no. 7, p. 2287, March 2021.
[9] IEEE 1872-2015: "IEEE Standard for Ontologies for Robotics and Automation".
[10] IEEE 2755-2017: "IEEE Guide for Terms and Concepts in Intelligent Process Automation".
[11] http://dictionary.ieee.org
[12] NIST https://www.nist.gov/system/files/documents/el/isd/ks/ALFUS-BG.pdf
[13] SAE AS4D Committee
[14] NIST, Autonomy Levels for Unmanned Systems (ALFUS) Framework; https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=823618
[15] Multimodal Fusion for Objective Assessment of Cognitive Workload: A Review, September 2019 IEEE Transactions on Cybernetics PP(99):1-14, DOI:10.1109/TCYB.2019.2939399
[16] Data Fusion and IoT for Smart Ubiquitous Environments: A Survey, April 2017 IEEE Access PP(99):1-1, DOI:10.1109/ACCESS.2017.2697839
[17] MPEG Use cases and requirements for Video Coding for Machines; https://www.mpeg.org/wp-content/uploads/mpeg_meetings/138_OnLine/w21545.zip
[18] 3GPP TR 22.886: "Study on enhancement of 3GPP support for 5G V2X services".
[19] 3GPP TR 26.926: "Traffic Models and Quality Evaluation Methods for Media and XR Services in 5G Systems".
[20] A.-L. Kim, et al., "Development of change detection algorithm using high resolution SAR complex image," in Proc. 2015 IEEE Asia-Pacific Conference on Synthetic Aperture Radar (APSAR), Online: https://ieeexplore.ieee.org/abstract/document/7306330
[21] G.W. Morgenthaler, et al, "Feasibility of Using a Miniature-UAV Remote-Sensing/Precision-Agriculture System (MINI-UAV RSPA SPACE SYSTEM) to Increase Crop Yields, Lower Costs, Save Water, and Reduce Pollution of Air, Soil, and Water," Aerospace Research Central Nov. 2012. Online: https://arc.aiaa.org/doi/abs/10.2514/6.IAC-06-D3.2.09
[23] Juliet J. Lee, "Remote Sensing and Artificial Intelligence for Wildlife Conservation – A Survey," WPAC, 2023. Online: https://wvwpac.wixsite.com/westview/post/remote-sensing-and-artificial-intelligence-for-wildlife-preservation-a-survey
[24] E. Felemban, et al., "Underwater Sensor Network Applications: A Comprehensive Survey," International Journal of Distributed Sensor Networks, Volume 11, Issue 11, November 2015. Online: https://journals.sagepub.com/doi/epub/10.1155/2015/896832
[25] Ki-Dong Lee, “An efficient real-time method for improving intrinsic delay of capacity allocation in interactive GEO Satellite networks,” IEEE Transactions on Vehicular Technology, vol 53, no. 2, pp. 538 – 546, March 2004. Online: https://ieeexplore.ieee.org/document/1275718
[26] ITU-R Future technology trends towards 2030, Available online: https://www.itu.int/en/ITU-T/Workshops-and-Seminars/2023/0724/Pages/default.aspx
[27] Statista, https://www.statista.com/topics/1143/mining/
[28] 3GPP TR 22.847: "Study on supporting tactile and multi-modality communication services; Stage 1 (Release 18)", 2022-03.
[29] Network-Enabled Robotic and Autonomous Systems, NextG Alliance, ATIS. May 2023. Available online: https://www.nextgalliance.org/white_papers/network-enabled-robotic-autonomous-systems/
[30] 3GPP TR 22.837: "Study on Integrated Sensing and Communication; Stage 1".
[31] 3GPP TR 22.856: "Study on Localized Mobile Metaverse Services; Stage 1". |
3239937b55fdc406849fa93870986364 | 22.916 | 3 Definitions of terms, symbols and abbreviations | |
3239937b55fdc406849fa93870986364 | 22.916 | 3.1 Terms | For the purposes of the present document, the terms given in 3GPP TR 21.905 [1] and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP TR 21.905 [1].
NOTE: Cited from IEEE 1872-2015 [9]
automated robot: A role for a robot performing a given task in which the robot acts as an automaton, not adapting to changes in the environment and/or following scripted plans.
fully autonomous robot: A role for a robot performing a given task in which the robot solves the task without human intervention while adapting to operational and environmental conditions.
orientation measure: Essentially a measure (Measure in SUMO) attributed to a (physical) object (Object in SUMO) concerning information regarding where the object is pointing to in relation to the reference object of the orientation coordinate system.
orientation region: Defines a region or interval orientation in relation to a reference object (Object in SUMO). For instance, the “south” interval of a compass constitutes an orientation region in the one-dimensional, circular coordinate system of the compass. Position regions and orientation regions are often referred to by similar words. For instance, it is valid to say that a robot is at the north position, facing north. The former relates to a position region, i.e., the north region of a given country; the latter relates to an orientation region, i.e., the orientation interval around north on the compass.
orientation value: A value in a coordinate system denoting a specific orientation. Orientation values in one coordinate system can be mapped to other coordinate systems. An example of use of orientation value is in “the robot is oriented 54° in relation to the reference object.”
remote-controlled robot: A role for a robot performing a given task in which the human operator controls the robot on a continuous basis, from a location off the robot, via only her/his direct observation. In this mode, the robot takes no initiative and relies on continuous or nearly continuous input from the human operator.
robot actuating part: A role for devices (Device in SUMO) that allow for the robot to move and act in the surrounding environment.
robot communicating part: A role for devices (Device in SUMO) that serves as instruments in a robot, robot communication process or a human-robot communication process by allowing the robot to send (or receive) information to (or from) a robot or a human.
robot group: A group (Group in SUMO) of robots organized to achieve at least one common goal.
robot processing part: A role played by processing devices which allows the robot to process information.
robot sensing part: A role played by any measuring device (MeasuringDevice in SUMO) that allows the robot to acquire information about its environment.
robot: An agentive device (Agent and Device in SUMO) in a broad sense, purposed to act in the physical world in order to accomplish one or more tasks. In some cases, the actions of a robot might be subordinated to actions of other agents (Agent in SUMO), such as software agents (bots) or humans. A robot is composed of suitable mechanical and electronic parts. Robots might form social groups, where they interact to achieve a common goal. A robot (or a group of robots) can form robotic systems together with special environments geared to facilitate their work.
semi-autonomous robot: A role for a robot performing a given task in which the robot and a human operator plan and conduct the task, requiring various levels of human interaction.
teleoperated robot: A role for a robot performing a given task in which a human operator, using sensory feedback, either directly controls the actuators or assigns incremental goals on a continuous basis, from a location off the robot. A teleoperated robot will complete its last command after the operator stops sending commands, even if that command is complex or time-consuming.
tandem sub-network: a group of robots (as a UE) that are serially connected to each other for a traffic session.
NOTE: According to this definition, a tandem sub-network can have a ring. For example, if robot A is connected to robot C via an intermediate robot B for a traffic session, it is said that these three robots have formed a tandem sub-network. If robot A has two different paths to robot C for a single traffic session, one via robot B and the other via robot B2, these four robots are also said to have formed a tandem sub-network.
Editor’s Note: The above definition is FFS. |
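The NOTE's example (two disjoint paths A-B-C and A-B2-C forming a ring) can be checked with a small sketch that enumerates simple paths over one session's links; this is an exploratory illustration of the definition, not normative, and the function and variable names are assumptions.

```python
from collections import defaultdict

def simple_paths(links, src, dst, path=None):
    # Enumerate simple paths from src to dst over one session's links.
    adj = defaultdict(set)
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    path = path or [src]
    if src == dst:
        return [path]
    found = []
    for nxt in adj[src]:
        if nxt not in path:                # keep paths simple (no revisits)
            found += simple_paths(links, nxt, dst, path + [nxt])
    return found

# The note's example: A reaches C via B and, independently, via B2.
session_links = [("A", "B"), ("B", "C"), ("A", "B2"), ("B2", "C")]
routes = simple_paths(session_links, "A", "C")
```

Two distinct routes over the same session is exactly the ring case the NOTE describes; one route would be a plain serial chain.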
3239937b55fdc406849fa93870986364 | 22.916 | 3.2 Abbreviations | For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply. An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in 3GPP TR 21.905 [1].
ALFUS Autonomy Levels for Unmanned Systems
ASR Automatic Speech Recognition
NLP Natural Language Processing
NLU Natural Language Understanding
ORA Ontology for Robotics and Automation
R&A Robotics and Automation
RCS Rich Communication Services
SOBOT Service Robot
SUMO Suggested Upper Merged Ontology
TTS Text To Speech
VoLTE Voice over LTE
VoNR Voice over NR |
3239937b55fdc406849fa93870986364 | 22.916 | 4 Overview | The present document addresses the existing and expected roles of communications in supporting operational models of service robots, focusing on two main collaboration modes within a group operation model. Maintaining high communication availability is crucial for optimal robot group performance. The document also emphasizes the importance of timely event-related information sharing, mentioning relevant studies and normative requirements. The document aims to analyse and document 3GPP 5G system support for groups of service robots in various usage scenarios, considering spectrum usage and diverse requirements from the robotics industry. Some of the features and aspects related to communication support for robot applications and group operations (such as those described in TSs 22.186, 22.125, 22.261 and 22.263) are also listed as 'Related existing service requirements' in each use case. Additionally, some related studies and deployment scenarios are summarized in Clause 6.
In this overview, various applications of service robots are discussed across different scenarios:
- Building 3D Maps in Unstructured Environments (5.1): Energy-efficient robots collaborate to create 3D maps in areas such as cleaning, disinfection, and agriculture. They adapt actions based on environmental conditions, emphasizing accuracy while optimizing computing and communication resources.
- Enhancing Security Protection (5.2): Robots and security staff collaborate for patrolling, target identification, tracking, and alarm reporting in specific areas. Synchronized transmission ensures real-time response to security events, improving overall security.
- Smart Cooperation for Data Integration (5.3): Robots collaborate to build an information set through data and sensor fusion, sharing real-time data and deciding on raw or pre-processed data transmission based on fusion levels. Applications explored include diverse underwater sensor networks.
- Service Robots with Visual Sensors (5.4): Robots equipped with visual sensors focus on indoor video surveillance and intelligent transportation. They detect security events, recognize objects, and share processed data for enhanced situational awareness. Machine interpretation and optimized bandwidth usage are prioritized.
- Service Robots in Continuing Care Retirement Communities (5.5): Robots assist in crime prevention, medical emergencies, natural language processing, and gesture recognition in care communities. They patrol, respond to emergencies, control smart home appliances, and deliver groceries, enhancing safety and support for residents.
- Voicebots for Spoken Conversations (5.6): Voicebots aid individuals, especially the elderly, in accessing digital services and information. They operate through speech-to-text processing, audio sample transmission, and voice calls, ensuring real-time, natural conversation experiences.
- Geo-surface Sensing and Multi-access Edge Computing (5.7): Geo-surface sensing applications generate vast data, requiring efficient preprocessing. Multi-access Edge Computing (MEC) enhances service robots' performance and efficiency, enabling innovative applications in navigation, object recognition, and human-robot interaction.
- Robots in Mining (5.8): Robots perform diverse roles in the mining industry, including exploration, drilling, hauling, inspection, maintenance, and environmental monitoring. They operate in extreme conditions, enhancing safety and efficiency. Enhanced communications and sensing networks are crucial for underground mining operations.
Each scenario highlights specific challenges and technologies faced by service robots in different contexts, ranging from security and safety applications to data integration and mining operations. |
3239937b55fdc406849fa93870986364 | 22.916 | 5 Use cases | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.1 Online cooperative high-resolution 3D map building | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.1.1 General description | This use case considers a low-energy (or energy-efficient) cooperation scenario in which a group of multiple robots collaboratively builds a 3D map, aimed at unstructured settings such as enterprise building cleaning, preparation for disinfection of large-scale buildings, and automation for agriculture. With cooperation among multiple robots gathering measurement data, it would be possible to save energy, to produce a better-quality outcome, or to attain both [5-7].
NOTE 1: Some aspects related to “automation for agriculture” can also be studied with a combined scenario of ground mobility and aerial mobility.
NOTE 2: The meaning of “map” in this use case is not necessarily limited to geographic appearance but it may also include still life objects that are useful or essential for robots working in an irregular and/or unstructured setting.
A group of service robots that are equipped with capabilities of multi-dimensional ambient sensing, computing (standalone and/or via compute fabric), federation in learning and model building, and 3GPP subscription-based communication, are in cooperation for a single joint project.
The availability of communication service to/from the edge (or cloud) is threefold: not available, temporarily unavailable, or available (for a certain period of time; the term "available" does not mean "permanently available").
NOTE 3: This use case is mostly focused on ProSe-based operation (also, referred to as “ProSe-enabled”) with partial or intermittent connection to NG-RAN (or to edge server via NG-RAN).
The edge (a server), if available for one or more of these service robots, will assist them to alleviate their computational burdens (that are or are not within the scope of 3GPP), giving rise to a demand of accessing service-specific network slice(s) or other forms of network resources with certain performance requirements.
An operator of robotic applications starts operating a group of service robots which are UEs.
These service robots discover each other and share their capabilities.
NOTE 4: For each service robot (UE), capabilities include certain characteristics such as types of supported RATs (e.g., NR, E-UTRA, or non-3GPP access technology) and information that are not within the scope of communications layer, such as remaining battery life.
All or some of these service robots form a working group (with one or more leader robots) and start communicating.
Member robots send measurement data to a leader robot so that the leader robot can perform the next step to build a 3D map.
NOTE 5: The roles of leader robot(s) include coordination required for the operation of the working group of service robots, such as acting as sync master for other robots (sync devices) within the working clock domain.
These service robots scan environmental parameters, including 3GPP service availability, and collaboratively decide which operational scenario they should choose (i.e., Uu-based or ProSe-based, also referred to as ProSe-enabled).
Each service robot in the working group moves in coordination with the others, forming a gregarious cluster (i.e., the distance between any pair is not so large that it degrades the quality of the map-building outcome).
Each service robot is exposed to uneven surfaces along its trajectory (e.g., the signal angle measurement is not static, and an unpredicted loss of measurement accuracy is likely to happen).
Depending on the accuracy level of 3D map at certain spot of the job site and decision made by the leader robot(s), the application layer of the leader robot requests to adjust the clock synchronisation target value within the clock synchronisation budget.
While moving along, one member robot, say robot A, faces some issue, resulting in an unexpected drop in its moving speed.
Member robot A has already predicted this issue beforehand: its follow-up actions include reporting this information to a leader robot and marking time stamp on the measurement data with this outlier situation.
It is up to member robot A whether or not to send the measurement data with the outlier indication to a leader robot.
It is up to the leader robot whether or not to use the received data with the outlier indication, if received from member robot A, for 3D map building.
Later, member robot A gets a little bit away from the gregarious cluster, leading to a temporary loss of connection to a relay UE robot (or to gNB in Uu-based scenario). Member robot A promptly resumes a connection.
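The outlier handling in this flow (member robot A marking time-stamped data when it predicts an issue, and the leader deciding whether to use such data) can be sketched as follows. This is a minimal illustration only; all names, types, and data structures are hypothetical and not part of any 3GPP specification:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    robot_id: str
    timestamp_us: int        # time stamp from the synchronised working clock domain
    point_cloud: list        # raw 3D measurement samples (placeholder)
    outlier: bool = False    # set by the member robot when it predicts an issue

def member_report(measurement: Measurement, predicted_issue: bool) -> Measurement:
    """Member robot marks its time-stamped data when it predicts an issue
    (e.g., an unexpected drop in moving speed)."""
    measurement.outlier = predicted_issue
    return measurement

def leader_select(measurements: list, accept_outliers: bool) -> list:
    """Leader robot decides whether outlier-marked data is used for map building."""
    return [m for m in measurements if accept_outliers or not m.outlier]

# Example: robot A predicts a speed drop and marks its data; the leader skips it.
reports = [
    member_report(Measurement("robot-A", 1000, [(0.1, 0.2, 0.3)]), predicted_issue=True),
    member_report(Measurement("robot-B", 1000, [(0.4, 0.5, 0.6)]), predicted_issue=False),
]
usable = leader_select(reports, accept_outliers=False)
```

Whether the leader accepts outlier-marked data would, per the flow above, remain an application-layer decision of the leader robot.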
Fig. 5.1.1-1: Inter-robot operation example when a network of service robots that have ambient intelligence (e.g., intra-robot operation) are in cooperation for a joint project [5,7].
The working group of service robots can build up 3D map with only necessary level of accuracy so that they do not have to consume computing and communication resources to build up a 3D map of an area that is overly accurate.
Also, for an important area, they could adjust the level of accuracy.
With the help of prediction-based indication, they could prevent potential noise factors that would otherwise have degraded the quality of the 3D map.
A robot that has instantaneously lost a connection can resume a connection very promptly and send time-critical information to other member(s). |
3239937b55fdc406849fa93870986364 | 22.916 | 5.1.2 Related existing service requirements | Clock synchronisation: 3GPP TS 22.104 [2]
- clause 5.6.1 Clock synchronisation service level requirements
- clause 5.6.2 Clock synchronisation service performance requirements
- clause 7.2.3.2 Clock synchronisation requirements
Timing resiliency: 3GPP TS 22.261 [3]
- clause 6.36.2 General requirements to ensure timing resiliency
- clause 6.36.3 Monitoring and reporting
- clause 6.36.4 Exposure
Multi-path relay: 3GPP TS 22.261 [3]
- clause 6.9.2.1 support of a traffic flow of a remote UE via different indirect network connection paths
Positioning: 3GPP TS 22.261 [3]
- clause 7.3.2 High accuracy positioning performance requirements (see also clause 5.7.1 of 3GPP TS 22.104 for Factory of the Future scenario)
Service continuity: 3GPP TS 22.263 [4]
- clause 5.5 Service continuity |
3239937b55fdc406849fa93870986364 | 22.916 | 5.1.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.1.3-001] 5G system is expected to be able to provide a means to ensure a very high accuracy level of clock synchronization to support that a group of service robots can build up 3D map collaboratively (i.e., synchronization among service robots within a collaborating group and synchronization among the multiple sources related to the respective service robots) in which the accuracy level is required by the applications layer.
NOTE 1: Clock synchronization accuracy is provided by the 5G system in order to support applications that require time-sensitive communication. Depending on what is required by the robot application layer, a higher accuracy level may be expected. In scenarios where a group of robots is connected via multi-hop, the current accuracy level of clock synchronization (e.g., Cooperative carrying – fragile work pieces; A.2.2.5, Table 7.2.3.2-1, 3GPP TS 22.104 [2]) might not be sufficient. A smaller clock synchronization budget might also be necessary to support collaborative robot (cobot) scenarios where there are further network elements in the robots behind the UE.
[CPG-5.1.3-002] 5G system is expected to be able to ensure the integrity and validity of clock synchronization for a designated length of time when a group of service robots are in ProSe-based operation outside the coverage area served by NG-RAN.
NOTE 2: The time length is dependent upon the type of project and is required by a service robot’s application.
[CPG-5.1.3-003] If the integrity and validity of clock synchronization cannot be ensured for a designated length of time, the 5G system is expected to notify the application.
[CPG-5.1.3-004] 5G system is expected to be able to provide a means for UE(s) to adjust the accuracy level of clock synchronisation.
[CPG-5.1.3-005] 5G system is expected to be able to provide a means to share the accuracy level and integrity-related information of clock synchronization with the cloud (in Uu-based scenario) or with the leader robot (in ProSe-based scenario).
[CPG-5.1.3-006] 5G system is expected to be able to provide a means to resume the connection when an ongoing connection is disrupted (e.g., due to radio link failure b/w a robot and the communicating counterpart) within a very short period of time, required by the applications layer.
NOTE 3: The current time period for securely reconnecting is required to be less than 1s. Robotic applications that perform critical roles may require much shorter time period: <100ms for critical, <10ms for highly critical.
NOTE 4: The requirement is not intended to cover scenarios where the service robot experiencing more severe levels of disruptions, such as 3GPP registration state changes.
[CPG-5.1.3-007] 5G system is expected to be able to provide a means to allow a member robot that has predicted communication disruption or measurement failure to disseminate necessary information, which is required by the applications layer, to one or more destinations within a very short period of time required by the applications layer.
NOTE 5: The required time period is application dependent (e.g. <100ms for moderate level of disruption, <10ms for critical level of disruption). The above CPG is intended for both RRC Connected mode and RRC-Inactive mode. It is not intended that the above CPG is applicable to RRC Idle mode.
[CPG-5.1.3-008] Based on the request from the application (e.g., from an application of the leader robot in a robot group, or from an application in the cloud server), 5G system is expected to be able to determine a specific area in which the system could adjust the accuracy level of clock synchronization, and to be able to provide a means to expose the network capabilities (e.g. capability of monitoring “clock synchronization accuracy level”) and the monitoring results (e.g. the accuracy level measured in the area of interest) to the application.
NOTE 6: Possible scenarios regarding the referred robot group include a group of “automated robots”, a group of “fully autonomous robots”, a group of “tele-operated robots”, and a group that consists of a suitable combination of those kinds of robots.
NOTE 7: See Annex B for an example scenario for robot applications to adjust the accuracy level of clock synchronization provided by 5G. The accuracy level that 5G system provides might differ depending on their availability and the level that the application layer requires might differ depending on the time and environment their task is being performed.
NOTE 8: In 3GPP TS 22.104 [2], Table 5.6.2-1 presents various scenarios that require different clock synchronization accuracy levels. The above CPG (i.e., [CPG-5.1.3-008]) is intended for a specific application or task of a robot (or a group of robots) that may need to adjust the clock synchronization accuracy level due to changes in application or due to changes in 5G system’s capability. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.2 Real-time cooperative safety protection | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.2.1 General description | This use case considers the collaboration between security staff and robots to complete security protection of a certain geographical area, including patrolling based on the configured route, target identification, target tracking, intelligent detection, alarm report, etc. The security protection task requires real-time information sharing among robots, security staff and remote security controller. In addition, the decision or adjustment of security protection schemes, which may be made by a leader robot, a security person or a remote security controller, also needs to be received and executed by all the participants (e.g. robots or security staff) synchronously. Through the real-time collaboration among robots, security staff and remote security controller, the performance and efficiency of security protection can be improved. The real-time cooperative safety protection also can reduce the labour intensity and work risks of security staff, as well as the cost (e.g. the number of security staff).
For example, one of the most important features of a smart factory is its production safety solution. Robots play an important role in smart factories. A group of robots equipped with cameras and sensors are used to collect and report real-time information periodically according to the configured route. The security protection decision maker can be a leader robot, a security person or a remote security controller. Based on the latest global information, the decision maker determines whether there is a security event and how to respond. The potential events of security protection contain intrusion detection, fall detection, smoke and flame identification, critical access occupancy identification, helmet identification, etc.
Fig. 5.2.1-1: Real-time cooperative safety protection
A group of robots equipped with cameras, sensors and 3GPP-based communication capabilities (e.g. direct network connection, indirect network connection or both) cooperatively work together to complete security protection of a certain geographical area. According to the complexity of the security protection task, the intelligence level of robots and the quality of communication service, a security person equipped with a 3GPP-based UE or a remote secure controller may be needed.
A security protection task is configured and started, which includes patrolling based on the configured route, target identification, target tracking, intelligent detection, alarm report, etc. A leader participant is chosen according to the complexity of the security protection task, the intelligence level of robots, the quality of communication service, etc. The leader participant is in charge of collecting the latest information from the other participants and determining the control information for them based on the latest global information. The leader participant can be a security person, a leader robot or a remote security controller. These three cases can be switched or used collaboratively. Taking a security person as the leader participant as an example, the service flows are described as follows.
1. Robots and the UE of security staff in the same security protection task discover each other and share their capabilities. The capabilities include both communication capabilities and service capabilities (e.g. sensor type, leader participant capability).
2. Based on the initial configuration, the leader participant (the UE of a security person) receives the latest information (e.g. location, target characteristic update, camera information, channel state information) from other participants (e.g. robots, other security staff) periodically. The time period is about 1ms to 100ms depending on the security protection task. For example, the sampling rate of indoor intrusion detection is 50Hz (20ms) in [8]. When the camera information of each UE is a video stream, data packets from a frame or video slice [19] are relevant. In some implementations all the data packets are needed; if some data packets are missing, the other data packets are useless. Therefore, to save resources, when some data packets of a frame or video slice cannot be transmitted on time, the other data packets from the same frame or video slice shall be discarded.
3. A single robot can only provide limited partial information, which is generally not enough for making a decision. One potential processing method is shown in Fig. 5.2.1-2: after the leader participant receives all the participants' data that arrived in the first decision period (e.g. the blue block), the leader participant processes all the data in that period to generate the global information. When some data packets of a participant cannot be transmitted on time, the data packets of other participants with the same time stamp shall be discarded to save network resources.
Based on the global information, the leader participant decides whether there is a security event (e.g. intrusion, fall, smoke, flame). The decision maker determines whether there is an intrusion every 100ms in [8]. Ideally, the data packets of each participant with the same time stamp shall arrive at the leader participant at the same time. However, the channel status differs across robots, which leads to different QoS for the data transmission (e.g. transmission delay, reliability). In some cases, the data packets of a participant often arrive at the leader participant in the last 20ms. Although the data packets of the other participants can arrive at the leader participant in the first 20ms, the leader participant still needs to cache the received data (for at most 80ms) to wait for data from the last participant. Moreover, the decision cannot be processed until the last data arrives, which means higher computing capability is required. Considering the power consumption and size of a UE/robot, the capacity of computing and cache/storage is limited. Synchronous transmission among the participants is required to save storage and simplify the processing logic of the UE/robot.
4. If the leader participant detects a security event based on the processing in step 3, the leader participant decides how to respond and sends control information to the other participants. Upon receipt of the control information, all the participants will execute their own response tasks cooperatively: some participants may move to their target locations at the specified time, some participants will track the specific target, other participants may broadcast an alarm tone, etc.
5. Repeat step 2~4 until the security protection task is completed. The participants can use direct network connection or indirect network connection to communicate with each other based on the quality of communication service of each link.
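The time-stamp-based discard rule in steps 2 and 3 (data of a time stamp is usable only if every participant's packets arrive within the decision period; otherwise all data with that time stamp is discarded) can be sketched as follows. The function and data layout are hypothetical illustrations, not normative behaviour:

```python
from collections import defaultdict

def fuse_on_time(packets, participants, deadline_ms):
    """Group received packets by time stamp; a time stamp is usable only if
    every participant's data arrived within the deadline. Otherwise all data
    with that time stamp is discarded, as in step 3 of the service flow.

    Each packet is (participant_id, timestamp_ms, arrival_ms, data)."""
    by_ts = defaultdict(dict)
    for pid, ts, arrival, data in packets:
        if arrival - ts <= deadline_ms:        # late packets are dropped on arrival
            by_ts[ts][pid] = data
    # Keep only time stamps for which every participant's data is present.
    return {ts: d for ts, d in by_ts.items() if set(d) == set(participants)}
```

In this sketch the leader would then run the security-event decision only on the complete time stamps, which mirrors the motivation for synchronous transmission: less caching and simpler processing logic at the leader.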
When one of the robots is the leader participant, the performance requirement of synchronized transmission may be stricter. Even though the leader robot usually has a higher computing capability and intelligence level, it still needs more time to process information, hence the time available for communication would be shortened. When the remote security controller is the leader participant, the latency of wireless network transmission may be longer because of the longer distance.
Fig. 5.2.1-2: The schematic diagram of data transmission in a safety protection task
All the participants can share latest information synchronously and execute real-time control cooperatively. The assigned security protection task can be completed efficiently. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.2.2 Related existing service requirements | Clock synchronisation: 3GPP TS 22.104
• clause 5.6.1 Clock synchronisation service level requirements
• clause 5.6.2 Clock synchronisation service performance requirements
• clause 7.2.3.2 Clock synchronisation requirements
Packet Delay Budget: 3GPP TS 23.501
• clause 5.7.4 Standardized 5QI to QoS characteristics mapping
TS 22.261 does not contain any requirements for synchronized transmission among multiple UEs. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.2.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.2.3-001] 5G system is expected to be able to provide a means to ensure synchronous arrival of associated data transmission for a collaborating group of UEs within a defined accuracy level.
[CPG-5.2.3-002] 5G system is expected to be able to provide a means to support resource-efficient data transmission when some data from one UE in a group is not needed, according to the synchronization need required by a trusted 3rd party for the collaborating group that the UE belongs to.
NOTE: CPG-5.2.3-002 focuses on preventing further data being transmitted by group members after a trusted 3rd party has determined that the synchronization needs are not met for the time period. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.3 Smart Communication Support for Data Collection and Fusion Using Multimodal Sensors in Multi-Robot / Multi-Agent Scenarios | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.3.1 General description | This use case considers a smart cooperation scenario for a group of robots to collaboratively build an information set (e.g., dataset or knowledge base in AI/ML) through data/sensor fusion [15] when the fusion of data from multimodal sensors is conducted by a group of robots.
NOTE: The term “fusion” in “data fusion”, “sensor fusion” and so on is also used interchangeably with “integration”.
The term “smart” is intended to suggest a concept of consuming low-energy, energy-efficient, resource-efficient and/or situation-aware means of communication to support an intended fusion task for the group of robots.
The use of “levels of fusion” is expected to help advance the United Nations Sustainable Development Goals (SDGs) in several aspects.
Provided that the 5G advanced technology enablers are designed in resource-efficient ways for various types of resources (e.g., radio resources, network resources, and materials such as battery-related ones), such considerations can also help provide affordable 6G services in society, especially when certain groups of residents, patients, public-safety officers, or underrepresented groups need the communication services the most at a critical point in time in their everyday living. Refer to Annex <A> for some examples that have already been identified.
There are various scenarios of multi-robot / multi-agent group operations in which a robot should be able to identify certain information (e.g., detecting an object, detecting multiple objects at the same time) or collect data that should be shared with other robots in that group in real-time. Fig. 5.3.1-1 shows two examples.
When a robot in a group begins to collect certain data (or information), the robot should determine what it should do with the data, such as whether to share the data without any pre-processing inside the robot (i.e., applications layer role utilizing some input coming from the communications layer), or to perform certain level of pre-processing before sharing the processed form of data with other participants (or participating robots) in the group for a certain task.
(a)
(b)
Fig. 5.3.1-1: Examples of using sensor data where objects are in different dimension/size and/or in different ranges. (a) Two distinct objects (A and B) of the same size at the different range (b) Two distinct objects (A and C) of different sizes at the same range (approximately).
Fig. 5.3.1-2 shows an example where both communication needs (i.e., sensor data that are outcome of one of multiple levels of fusion process inside the originating robot) and communication opportunities (i.e., how much communication resources are likely to be available for a robot when there are multiple robots in place) are fluctuating, leading to a complex scheduling load onto 5G systems, such as at a RAN node or a CN node.
In order for 5G systems to be able to efficiently and reliably support the dynamic need for transmission opportunities, it is necessary that robots (as UEs) be provided with a suitable means to share their intents (e.g., levels of fusion, desired amount of traffic to transmit at a certain point in time).
Fig. 5.3.1-2: Example of different levels of communications opportunity need (or transmission opportunity need) under a combination of normal and challenging (or extreme) communication conditions. Both communication needs (e.g., traffic volume and communication link availability) can be different and dynamically fluctuating subject to changes in a given environment.
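As a rough illustration of sharing such intents, the sketch below shows how a robot's application layer might pick a fusion level whose traffic demand fits the currently available transmission opportunity and then expose that intent to the communications layer. The levels, byte figures, and function names are invented for illustration and carry no normative meaning:

```python
# Hypothetical traffic demand per fusion level (bytes per reporting period):
# level 0 = raw sensor data, level 1 = pre-processed features, level 2 = fused summary.
TRAFFIC_PER_LEVEL = {0: 5_000_000, 1: 400_000, 2: 20_000}

def choose_fusion_level(available_bytes: int) -> int:
    """Pick the lowest (richest) fusion level whose traffic demand fits the
    currently available transmission opportunity; fall back to the most
    compact fused summary otherwise."""
    for level in sorted(TRAFFIC_PER_LEVEL):
        if TRAFFIC_PER_LEVEL[level] <= available_bytes:
            return level
    return max(TRAFFIC_PER_LEVEL)

def intent(level: int) -> dict:
    """Information a robot could share as its 'intent' (level of fusion and
    desired amount of traffic to transmit)."""
    return {"fusion_level": level, "bytes_per_period": TRAFFIC_PER_LEVEL[level]}
```

Such a mapping would let the scheduling side anticipate each robot's fluctuating traffic demand rather than reacting to it packet by packet.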
Figure 5.3.1-3a presents some examples of underwater sensor network applications and their classifications, as presented by Felemban et al. [24]. Figure 5.3.1-3b presents a common architecture for underwater sensor networks (UWSN). The authors presented several categories of applications, such as monitoring applications, disaster applications, and military and assisted-navigation applications, based on a few underwater models:
• 1D-UWSN Architecture. One-dimensional- (1D-)UWSN architecture refers to a network where the sensor nodes are deployed autonomously.
• 2D-UWSN Architecture. Two-dimensional- (2D-) UWSN architecture refers to a network where a group of sensor nodes (cluster) are deployed underwater.
• 3D-UWSN Architecture. In this type of network, the sensors are deployed underwater in the form of clusters and are anchored at different depths.
• 4D-UWSN Architecture. Four-dimensional- (4D-)UWSN is designed by the combination of fixed UWSN, that is, 3D-UWSN and mobile UWSNs.
Fig. 5.3.1-3a: Examples of underwater sensor network applications [24].
Fig. 5.3.1-3b: Underwater sensor network architecture [24]. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.3.2 Related existing service requirements | Clock synchronisation: 3GPP TS 22.104
• clause 5.6.1 Clock synchronisation service level requirements
• clause 5.6.2 Clock synchronisation service performance requirements
• clause 7.2.3.2 Clock synchronisation requirements
NOTE 1: The types of sensor data and media that robots are collecting, pre-processing and sharing with each other and/or with edge cloud (or edge server, cloud server) are related to the need of fulfilling the above sets of requirements. Clock synchronization requirements are mostly related to ProSe communication scenarios.
Timing resiliency: 3GPP TS 22.261
• clause 6.36.2 General requirements to ensure timing resiliency
• clause 6.36.3 Monitoring and reporting
• clause 6.36.4 Exposure
NOTE 2: Timing resiliency is considered as a set of preconditions that ensure the “clock synchronization” especially when robots (as a UE) or the leader robot(s) (as opposed to “robot followers”) are served by at least one PLMN.
Multi-path relay: 3GPP TS 22.261
• clause 6.9.2.1 support of a traffic flow of a remote UE via different indirect network connection paths
Service continuity: 3GPP TS 22.263
• clause 5.5 Service continuity
NOTE 3: Service continuity is not necessarily related to all types of sensor data and media.
The following aspects are considered as potentially covered by the existing service requirements (Refer to Table 5.3.2-1). It is FFS whether there exist some gaps that are not identified.
Table 5.3.2-1: Support of communications layer adaptive to the use of different levels of fusion and to dynamic changes of communication resource availability.
Approach by “Levels of Fusion”
Existing features / requirements
Remarks
Three-level approach
[CPG-5.3.2-002a] [Underground model] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [10 m] (depth) x [50 m] (radius, horizontal range).
(NOTE 1)
[CPG-5.3.2-002b] [Near-ground surface model 1] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [10 m] (height) x [500 m] (radius, horizontal range).
(NOTE 2)
[CPG-5.3.2-002c] [Near-ground surface model 2] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [10 m] (height) x [50 m] (radius, horizontal range).
(NOTE 3)
[CPG-5.3.2-002d] [Underwater model] 5G system is expected to be able to support up to [TBC] robots with a deployment and operation model with a range of less than [50 m] (depth) x [1000 m] (radius, horizontal range on water surface).
(NOTE 4)
NOTE 1: Radio propagation characteristics can affect the performance but are not the scope of stage-1 study.
NOTE 2: Model 1 is related to initial search in, e.g., urban search and rescue scenarios.
NOTE 3: Model 2 is related to intensive search in, e.g., urban search and rescue scenarios.
NOTE 4: The path loss exponent for underwater (Line of Sight) ranges from 2 to 4. However, compared to fresh water conditions, the seawater is known to have a higher value of water conductivity (greater than 2) and to have a higher absorption loss of electro-magnetic waves. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.3.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable. The following aspects in Table 5.3.3-1 are expected to be supported.
Table 5.3.3-1: Support of communications layer adaptive to the use of different levels of fusion and to dynamic changes of communication resource availability.
Approach by “Levels of Fusion”
Challenges and potential gaps
Remarks
Three-level approach
[CPG-5.3.3-001] 5G system is expected to be able to provide a suitable means with a very high efficiency and reliability to accommodate the dynamic changes at a robot’s application layer related to traffic demand (e.g., caused by using different levels of data/sensor fusion within a robot).
(NOTE 1)
Editor’s Note: In the above CPG, the degrees of efficiency and reliability are FFS.
[CPG-5.3.3-002] The 5G system is expected to provide a suitable means for the application layer of robots, based on its availability and capability, to allocate network resources (e.g., a network slice) specifically for robot applications.
NOTE 2: This allows for suitable adaptations in the communication layer, addressing dynamic changes occurring in a robot's application layer.
Six-level approach
FFS
(NOTE 3)
NOTE 1: Examples include a suitable API that can support many intra-robot sessions between robot’s applications layer (e.g., “robot sensing part”, “robot processing part”, “robot actuating part”) and communications layer (e.g., “robot communication part”).
NOTE 3: It is considered more complex to apply, compared to three-level approach due to the increased quantity/dimension of computation and communication needs. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.4 Media-related use cases | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.4.1 General description | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.4.1.1 Video surveillance | The use cases focus on media aspects of service robots equipped with visual sensors such as cameras and with capabilities to analyse video signals (e.g. feature extraction, object tracking and detection) and to process this information (i.e., take consequent action). Those use cases are directly related to the standardization effort in MPEG for defining a media compression format called Video Coding for Machines (VCM). Associated MPEG use cases and requirements are available [17].
This use case addresses the scenarios in which automatic object detection and tracking is achieved with video cameras. One illustration is the monitoring of indoor environments in which features of the captured video need to be extracted, such as intrusion detection, fire detection, or recognition of trusted individuals. The basic service configuration is depicted in Figure 5.4.1.1-1 below.
Figure 5.4.1.1-1: Use case on video surveillance
In this scenario, video cameras capture a video stream and extract related features through pre-processing such as object detection and tracking; both are sent to the back-end server for analysis and processing.
The uplink bandwidth is expected to be used in an optimized way in a configuration in which there is no need for human understandability of the signal. The image quality requirements are thus limited to the machine interpretation. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.4.1.2 Intelligent transportation | In a smart traffic system, cars may need to communicate features with each other and with other sensors in order to perform different tasks. Sensors in the infrastructure may communicate features towards different vehicles, which then use these features to do object detection, lane tracking, etc. Final processing of these features is done on the individual vehicles.
An example is illustrated in Figure 5.4.1.2-1. The front car with multiple cameras sees the surrounding environment and detects and recognizes objects such as cars, pedestrians, or street furniture, or even recognizes events such as traffic jams or accidents, potentially using (deep) neural networks. The processed data (feature maps) may be consumed internally for desired tasks, and/or the extracted features may be compressed and transmitted to other surrounding cars/infrastructure (e.g., a roadside cell/grid) for further analysis. Sending a standardized compact bitstream is essential for interoperability between various vendors, IoV (Internet of Vehicles) applications, and IoT (Internet of Things) applications.
Figure 5.4.1.2-1: Use case on V2X communications
The bandwidth is expected to be used in an optimized way in a configuration in which there is no need for human understandability of the signal. The image quality requirements are thus limited to the machine interpretation. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.4.2 Related existing service requirements | There is no known service requirement that relates to the bitrate efficiency of robotic video signals; however, the Release 16 study on enhancement of 3GPP support for 5G V2X services in TR 22.886 [18] describes the KPIs for video data sharing, particularly for machine-centric video data analysis. The following parameters are used as related KPIs:
- Latency: less than 10 ms;
- Data rate: 100-700 Mbps; and
- Reliability: 99.99%.
3239937b55fdc406849fa93870986364 | 22.916 | 5.4.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
Functionality aspects:
[CPG-5.4.3-001] 5G system is expected to convey media signals for machine type communications that support feature extraction and descriptor signalling.
NOTE: The type of feature and descriptor is out of scope of this document.
Efficiency aspects:
[CPG-5.4.3-002] 5G system is expected to meet the service requirements to enable communication between robots using media formats offering at least a better compression efficiency than the 3GPP codecs already specified for human consumption. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.5 Smart community | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.5.1 General description | This use case considers service robots in a Continuing Care Retirement Community (CCRC). A CCRC is a community that provides continuing care for the elderly. There are several types of assistance and care required in the community. First is independent living, in which residents can take care of themselves and require limited assistance. Second is assisted living, in which residents receive help as needed with daily tasks such as bathing and dressing. Third is 24-hour nursing home care, which usually takes place in a dedicated skilled nursing facility.
The service robots within the community could help to maintain and improve the liveability of the community. A CCRC requires a lot of labour to take care of the residents; service robots could therefore help by relieving human workers of part of this work.
There are multiple types of robots served in the community:
1) patrol robots for crime prevention:
Service robots could be equipped with cameras or sensing entities that record the surroundings or perform sensing services. The service robots could record videos or take photos of the surroundings and then send them to a server for processing to detect potential crime. A service robot might not have the computational resources to process the gathered information itself; in order to act rapidly on a potential crime, the video or photos could be uploaded to the local edge server for processing.
Also, patrol robots could help with the community security and personnel security, such as detecting the fallen elderly around the community for a quick rescue. There could be uncrewed aerial vehicle patrol robots and patrol robots on the ground as shown in the following figure.
The cellular coverage usually does not cover the whole community. Relays could be placed within the community to extend the cellular coverage and ensure the communication of the patrol robots, for example by co-locating them with the surveillance cameras, which are assumed to be well located within the community.
Fig. 5.5.1-1: Patrol robots in the CCRC.
Daily patrolling:
1. When the patrol robot is in weak cellular coverage, it automatically switches to an indirect network connection. It could connect to the base station via the stationary relays that are placed with the surveillance cameras. When it detects that the connection to the base station is good enough, it switches back to a direct network connection. When the patrol robots send the surveillance video to the control centre, they could establish multiple indirect network connection paths.
2. The patrol robot takes videos or photos when it detects a security event from its sensing unit and sends them to the control centre.
3. During the daily patrolling, the patrol robots might detect a risk to community safety (e.g. a thief); the patrol robots will then track the thief and report the real-time location to the security department. Based on the reports from the patrol robots, the security department might provide tracking assistance information and tracking instructions to the patrol robots from its surveillance system. The patrol robots move around the community, and the connection paths might change during their movement. Service continuity is required for the robots to receive clear tracking instructions.
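The switching behaviour in step 1 can be sketched as a small hysteresis rule. This is an illustrative assumption only: the TR does not define thresholds, and the names `DIRECT_OK_DBM`, `HYSTERESIS_DB` and `select_connection_mode` are hypothetical.

```python
# Hypothetical sketch of the connection-mode selection in step 1: switch to
# an indirect connection (via a stationary relay) when direct cellular
# coverage is weak, and back when it recovers. Threshold values are
# illustrative assumptions, not taken from the TR.

DIRECT_OK_DBM = -100.0   # assumed RSRP above which the direct link is "good enough"
HYSTERESIS_DB = 5.0      # assumed margin to avoid ping-ponging between modes

def select_connection_mode(current_mode: str, rsrp_dbm: float) -> str:
    """Return 'direct' or 'indirect' given the measured signal strength."""
    if current_mode == "direct" and rsrp_dbm < DIRECT_OK_DBM - HYSTERESIS_DB:
        return "indirect"   # weak coverage: relay via a surveillance-camera relay
    if current_mode == "indirect" and rsrp_dbm > DIRECT_OK_DBM + HYSTERESIS_DB:
        return "direct"     # coverage recovered: connect to the base station directly
    return current_mode
```

The hysteresis margin keeps the robot from oscillating between modes when the measured signal sits near the threshold.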
Medical assistance event:
Fig. 5.5.1-2: Medical assistance event with patrol robots.
1. When an elderly person feels uncomfortable at home, he/she could ask for medical assistance from the control centre of the CCRC by pressing a button in his/her home. The control centre would dispatch the nearest patrol robot to measure and monitor the vital signs of the elderly person, such as heart rate, blood pressure and blood oxygen, by connecting with the sensing devices.
2. In order to dispatch the nearest patrol robot, the control centre should obtain the location information of the patrol robots, and the patrol robots should be equipped with dynamic path planning to find the shortest path. Communication between the control centre and the patrol robots could use a direct network connection or an indirect network connection.
3. The patrol robot gets a temporary authorization to connect to the elderly person's personal smart devices to monitor his/her health condition.
4. When the patrol robot arrives, it connects with the smart watch that the elderly person is wearing, or with smart devices at home that have sensing capabilities to monitor the vital signs. The vital signs of the elderly person could be transmitted to the first-aider via the smart watch and the patrol robot. The patrol robot is also equipped with a camera that captures live video of the elderly person for health monitoring. The live video from the camera and the vital signs transmission should be synchronised.
5. Once the first-aider arrives, he/she can begin the medical treatment immediately.
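Step 2 above (dispatching the nearest patrol robot) can be sketched as follows. Straight-line distance stands in for the dynamic path planning the text describes; the function name and data layout are hypothetical.

```python
import math

# Illustrative sketch of step 2: the control centre picks the nearest
# available patrol robot from the location reports it has collected.
# Euclidean distance is a simplification of the dynamic path planning
# described in the text; all names are assumptions.

def dispatch_nearest(robots, elderly_pos):
    """Return the id of the robot closest to the reported home position.

    robots: mapping of robot id -> (x, y) position
    elderly_pos: (x, y) position of the home that requested assistance
    """
    return min(robots, key=lambda rid: math.dist(robots[rid], elderly_pos))
```

In a real deployment the distance metric would be the planned path length over the community map, not the straight-line distance.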
2) natural language or gesture recognition from the service robots:
Residents could use service robots with natural language processing capability to control smart home appliances, e.g. smart refrigerators, smart speakers, and smart washing machines. Additionally, the gesture recognition capability of the service robots could help with the remote control of smart home appliances. Natural language and gesture recognition require processing capabilities, and the voice or video traffic could be offloaded to the edge server for processing, meeting lower latency requirements compared with communication via cloud computing.
3) smart transportation for delivery robots:
Service robots could help to deliver groceries or parcels to customers. Smart delivery route planning could be based on a local map of the community. The local map might include private information about the residents, which could be stored at the local edge server for privacy reasons.
NOTE: Depending on regional or national laws, a delivery robot is considered in different ways: (a) if delivery robots are considered vehicles (e.g. in South Korea), they should follow the rules that a regular road vehicle does, and some or all V2X-related aspects are suitable for service scenarios; (b) if they are considered vulnerable road users under certain conditions (e.g., speed limit, weight limit), they are allowed to operate on sidewalks (e.g., in the State of Pennsylvania).
3239937b55fdc406849fa93870986364 | 22.916 | 5.5.2 Related existing service requirements | Indirect network connection: 3GPP TS 22.261
• Clause 6.9 connectivity models requirements
Positioning: 3GPP TS 22.261
• Clause 6.27.2 Positioning services requirements
Management of a PIN: 3GPP TS 22.261
• Clause 6.38.2.2 Authorizing/deauthorizing PIN Elements with Management Capability
Efficient user plane: 3GPP TS 22.261
• Clause 6.5 efficient user plane
V2X aspects: TS 22.186
• Clause 5.3 Advanced Driving
• Clause 5.5 Remote Driving
NOTE: V2X aspects are suitable for service scenario in countries or regions where a delivery robot is considered a vehicle (e.g., in South Korea). |
3239937b55fdc406849fa93870986364 | 22.916 | 5.5.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.5.3-001] 5G system is expected to maintain service continuity of an indirect network connection for a remote UE when the communication path to the network changes while the remote UE is using multiple indirect network connection paths for a single traffic flow. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.6 Real-time conversational robot | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.6.1 General description | This scenario proposes a service robot that participates in a spoken conversation with a human.
Voicebots can enable an elderly person to access digital services whose use is difficult for them due to physical or cognitive challenges. Voicebots are especially well suited for primary contacts with customer service, because they make it easier and faster to access the services. In addition, voicebots can help in gathering information related to an elderly person's wellbeing and functioning, for example. This can speed up health care personnel's response to the customer's service needs, making the allocation of resources easier while improving patient safety as a consequence.
In addition, voicebots can support health care professionals in performing routine tasks, which often require a lot of resources. Voice-based solutions can speed up, for example, making appointments, reporting laboratory results, or conducting customer surveys for large groups. In this manner, professionals' resources would be freed up for those tasks where there is a need for human service.
Fundamentally, a voicebot can work in one of three ways:
1) The user's device works on the text transcript of the customer's speech, obtained using an Automatic Speech Recognition (ASR) engine, sends this text transcript to a cloud- or Edge-based Natural Language Processing (NLP) and Natural Language Understanding (NLU) entity capable of processing text transcripts, receives a response, and converts the text response to voice using a Text-to-Speech (TTS) engine,
2) The user's device records voice input as audio samples, transmits these audio samples to a cloud- or Edge-based NLP and NLU entity capable of processing speech audio, receives a response, and converts the text response to voice using a TTS engine,
3) The user’s device establishes a voice call (VoIP, VoLTE, VoNR, RCS, or other) towards a cloud or Edge function which directly interprets the speech, performs analysis and response formulation, before directly responding to the user by generating speech.
There are latency constraints with solution (1) (many ASR and TTS engines each consume about a second of processing time, rendering the speech flow unnatural) which might limit its suitability for conversational services. Typically, solution (3) is preferred in the voice assistant industry where conversational response times are less critical. It is proposed here that for voicebot services, solution (3) is the prime candidate. However, it should be noted that if the seconds of latency experienced in solutions (1) and (2) are covered by some language or some sound as the voicebot retrieves information, the perceived latency may be much less significant than the actual latency.
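A rough latency budget for the three architectures can be sketched as below. Every component time is an illustrative assumption (the text only states that many ASR and TTS engines each consume about a second); the point is the relative ordering of the three solutions, not the absolute numbers.

```python
# Rough mouth-to-ear latency budget for the three voicebot architectures.
# All component times are illustrative assumptions, not figures from the TR.

BUDGET_MS = {
    "asr_on_device": 1000, "tts_on_device": 1000,  # assumed, per "about a second" each
    "uplink": 50, "downlink": 50,                  # assumed network delays
    "nlp_nlu": 200,                                # assumed text-in, text-out processing
    "cloud_speech_pipeline": 400,                  # assumed speech-in processing
}

def latency_ms(mode: int) -> int:
    b = BUDGET_MS
    if mode == 1:  # on-device ASR, text transcript to cloud NLP/NLU, on-device TTS
        return b["asr_on_device"] + b["uplink"] + b["nlp_nlu"] + b["downlink"] + b["tts_on_device"]
    if mode == 2:  # audio samples to cloud NLP/NLU, on-device TTS
        return b["uplink"] + b["cloud_speech_pipeline"] + b["downlink"] + b["tts_on_device"]
    if mode == 3:  # voice call; the cloud/Edge function handles speech end to end
        return b["uplink"] + b["cloud_speech_pipeline"] + b["downlink"]
    raise ValueError(mode)
```

Under these assumed figures the ordering matches the text: solution (3) is the fastest, and the on-device ASR/TTS stages dominate solutions (1) and (2).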
Solution (3) results in the requirement that the 3GPP system needs to be able to maintain a resilient voice connection with sufficient throughput and quality to enable the user’s speech to be adequately processed at the cloud or Edge ASR engine, to allow for voicebot processing and response generation, and for the voicebot’s response to be received by the user with sufficiently low latency as to enable a perception of “humanity” by the user. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.6.2 Related existing service requirements | TS 22.261, clause 7.6.1: To support interactive task completion during voice conversation, the 5G system shall support low-delay speech coding for interactive conversational services (100 ms, one-way mouth-to-ear).
ITU-T Recommendation G.114 & 3GPP TS 26.114: See Figure 5.6.2-1 for the relationship between mouth-to-ear delay (one way transmission time for voice, also consider this as voicebot-to-ear delay) and perceived quality by the user
Figure 5.6.2-1: ITU-T Rec. G.114 – Determination of the effects of absolute delay by the E-model. |
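The curve in Figure 5.6.2-1 is produced by the E-model. A commonly cited simplified approximation of the E-model delay impairment factor Id as a function of one-way mouth-to-ear delay Ta (in ms) is sketched below; treat it as an approximation associated with ITU-T G.107, not a reproduction of the figure itself.

```python
# Sketch of a commonly cited simplified approximation of the E-model delay
# impairment factor Id (ITU-T G.107): impairment grows slowly with one-way
# delay Ta, then much faster once Ta exceeds roughly 177.3 ms. This is an
# approximation used for illustration, not the exact E-model computation.

def delay_impairment(ta_ms: float) -> float:
    """Approximate E-model delay impairment Id (higher = worse quality)."""
    idd = 0.024 * ta_ms
    if ta_ms > 177.3:                # extra penalty beyond ~177.3 ms one-way delay
        idd += 0.11 * (ta_ms - 177.3)
    return idd
```

The knee near 177 ms is why the TS 22.261 target of 100 ms one-way mouth-to-ear delay sits comfortably in the high-quality region of the figure.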
3239937b55fdc406849fa93870986364 | 22.916 | 5.6.3 Challenges and potential gaps | None. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.7 MEC for Efficient Management of Geo-surface Sensing Data Using a Group of Aerial Robots | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.7.1 General description | Geo-surface Sensing:
Geo-surface sensing is a wide area of remote sensing applications encompassing, but not limited to:
• agricultural use cases such as detecting the change of farming field surface (e.g., growth and/or other conditions of plants) [20,21]
• wildlife preservation use cases such as detecting / tracking the movement or change of certain kind of animals of interest [23]
• 3D architectural design and urban planning use cases: urban planning covers both new development and renovation that are supported by geo-surface sensing technology combined with data collection, analysis and prediction via collaboration of multiple entities (e.g., autonomous robots capable of role-playing in various settings), which require communications.
• many more
The above are expected to produce a huge amount of measurement data that would require a certain level of pre-processing and sharing with participating role-players for efficient data fusion. The quality of the outcome of data fusion is an important optimization objective and, at the same time, the efficiency of the process required to produce that quality outcome is critically important. Considering the United Nations' Sustainable Development Goals, a study on recommendations and requirements is essential for developing resource-efficient (e.g., energy-efficient, time-efficient) and sustainable solutions.
Multi-access Edge Computing (MEC) for SOBOT:
Edge cloud service (or MEC) for service robots is an emerging technology aimed at enhancing the performance and efficiency of these robots. It involves granting robots access to computing resources located closer to them, resulting in reduced latency and improved service quality.
Using edge cloud service offers several advantages for service robots. Firstly, it enhances their performance by providing them with access to more robust computing resources. This enables robots to execute complex tasks, such as navigating intricate environments or engaging in natural interactions with humans.
Secondly, edge cloud service improves the efficiency of service robots by offloading some processing tasks to the edge cloud. This reduces the load on the robot's own resources, extending its battery life and making its operation more cost-effective.
While still in the developmental stage, edge cloud service has the potential to revolutionize the use of service robots. By equipping robots with powerful computing resources, this technology enhances their performance and efficiency, paving the way for innovative applications.
Below are examples illustrating how edge cloud service can benefit service robots:
• Intelligent navigation: Real-time information about the robot's surroundings, including obstacle locations, can be provided through edge cloud service. This assists in improving navigation performance and avoiding collisions.
• Object recognition: Edge cloud service enables robots to recognize objects in their environment, enhancing their ability to interact with objects and accomplish tasks.
• Human-robot interaction: By leveraging edge cloud service, robots can communicate and interact with humans in a more natural manner. This facilitates better comprehension and response to human commands.
Fig. 5.7.1-1 presents some examples of usage scenarios where MEC (Edge cloud service) is used for geo-surface sensing data management. The expected types of aerial robots are fully autonomous robots, semi-autonomous robots, remote-controlled / teleoperated robots. The robots are expected to be able to meet the orientation value requirement, required by the robot operator or by the operating applications that are coordinated by the robot operator in the edge cloud.
Fig. 5.7.1-1: Usage scenarios of MEC (Edge cloud service) for geo-surface sensing data management. Some features and KPI characteristics for communication are recommended. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.7.2 Related existing service requirements | Clock synchronisation: 3GPP TS 22.104
• clause 5.6.1 Clock synchronisation service level requirements
• clause 5.6.2 Clock synchronisation service performance requirements
• clause 7.2.3.2 Clock synchronisation requirements
NOTE 1: The MEC scenario described in clause 5.7.1 assumes collaboration among the group of aerial robots. The data collection and sensor fusion aspects (in clause 5.3 and in Annex A - Levels of Fusion) are still important considerations for this MEC scenario.
NOTE 2: The types of sensor data and media that robots are collecting, pre-processing and sharing with each other and/or with edge cloud (or edge server, cloud server) are related to the need of fulfilling the above sets of requirements.
Multi-path relay: 3GPP TS 22.261
• clause 6.9.2.1 support of a traffic flow of a remote UE via different indirect network connection paths
Positioning: 3GPP TS 22.261
• clause 7.3.2 High accuracy positioning performance requirements (see also clause 5.7.1 of 3GPP TS 22.104 for Factory of the Future scenario)
Efficient user plane: 3GPP TS 22.261
• Clause 6.5 Efficient user plane
Service continuity: 3GPP TS 22.263
• clause 5.5 Service continuity |
3239937b55fdc406849fa93870986364 | 22.916 | 5.7.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.7.3-001] A robot that is participating in a group project and is designated (or nominated) by the robot operator is expected to be able to meet the orientation value requirement, required or adjusted by the robot operator or by one or more operating applications that are coordinated by the robot operator in the edge cloud.
NOTE 1: The orientation value defined and required by the robot operator is expected to be used by communications layer(s), e.g., for ranging, when certain communications layer KPI target needs to be adjusted for robot operation.
NOTE 2: In this scenario, a robot consists of robot communicating part and, optionally, a combination of robot actuating part, robot processing part, and robot sensing part.
NOTE 3: The expected types of robots listed in clause 5.7.1 are intended to describe the robots' capability to maneuver over the project site (job site), but are not intended to describe their capability to control/adjust orientation for sensing and measurements. Such capabilities are intended to be specified in combination with communication features in CPGs.
[CPG-5.7.3-002] 5G system is expected to provide a means to ensure continuity of session when a subset of robots participating in a group project need handover (i.e., switching between source gNB and target gNB) or trigger a relocation from the current MEC server to a new MEC server.
NOTE 4: The term "continuity of service" can be interpreted in a few different ways, depending on the type of robotic application. For example, for real-time acoustic and/or audio sensing data, it can be interpreted as "service continuity" (i.e., within 20 ms of interruption); for 3D RF sensing measurement data, it is recommended to ensure up to 10 ms of interruption, not including the intrinsic propagation delay (e.g., a total of approximately 500 ms for one round trip to a ground station over a GEO satellite, 4 * 125 ms) [25]. For example, the minimum one-way value occurs when a UE is located at a point on the equator closest to the satellite: 36 000 km / (3 * 10^5 km/s) ≈ 120 ms.
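The propagation-delay arithmetic quoted in NOTE 4 can be checked directly:

```python
# Checking the propagation-delay arithmetic quoted in NOTE 4: the minimum
# one-way delay to a GEO satellite at ~36 000 km altitude (UE at the closest
# point on the equator), and the quoted round-trip total of four ~125 ms hops.

SPEED_OF_LIGHT_KM_S = 3e5        # km/s, as used in the note
GEO_ALTITUDE_KM = 36_000

one_way_ms = GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S * 1000   # minimum one-way delay
round_trip_ms = 4 * 125                                     # one round trip via the satellite
```

This confirms the note's figures: about 120 ms minimum one-way, and roughly 500 ms for a full round trip to a ground station and back.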
[CPG-5.7.3-003] 5G system is expected to provide a means to ensure the acceptable range of arrival time difference(s) of information, defined and required by the robot operator for particular robotic application(s), when the robotic application(s) uses multiple routes for information delivery (e.g., one terrestrial route and one non-terrestrial route).
[CPG-5.7.3-004] 5G system is expected to provide a means to expose the capability on propagation delay-related information when offering candidate routes to the robotic application(s).
[CPG-5.7.3-005] 5G system is expected to provide a means to provide expected disruption time for the robot when communication path switching (e.g., from satellite to terrestrial, or vice versa) is necessary so that the robot can select suitable alternative path according to its preference and traffic characteristics. |
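A minimal sketch of the path selection implied by [CPG-5.7.3-005], assuming the network exposes per-candidate expected disruption time and one-way delay. The field names and selection policy are hypothetical, not taken from the TR.

```python
# Hypothetical sketch of [CPG-5.7.3-005]: among candidate routes exposed by
# the network, keep those whose expected disruption time fits the robot's
# budget, then prefer the lowest one-way delay. Field names and the policy
# are assumptions for illustration only.

def select_path(candidates, max_disruption_ms):
    """Return the preferred candidate route dict, or None if none fits."""
    ok = [p for p in candidates if p["expected_disruption_ms"] <= max_disruption_ms]
    return min(ok, key=lambda p: p["one_way_delay_ms"]) if ok else None
```

A robot with delay-sensitive traffic would set a tight disruption budget and fall back to buffering (or defer the switch) when `select_path` returns None.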
3239937b55fdc406849fa93870986364 | 22.916 | 5.8 A group of autonomous robots and tele-operated robots working on mining actuation and delivery | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.8.1 General description | |
3239937b55fdc406849fa93870986364 | 22.916 | 5.8.1.1 The use of a group of robots in mining: | It is expected that the mining industry will pave new pathways toward global sustainability, including "greater inclusion and diversity efforts" (e.g., worker safety, well-being of employees in general), the green/clean energy transition, material consumption reduction, deep-sea mining exploration, and greenfield exploration [27]. More interestingly, the industry, like other industries, is exploring a new path to "cloud-integrated mining processes".
Fig. 5.8.1.1-1 Examples of extreme working conditions in mining site (underground).
Fig. 5.8.1.1-2 Examples of extreme working conditions with workers’ roles replaced by robots. Integrated sensing and communications (ISAC), distributed sensing and communication (data collections), and AI-enabled compute are expected.
Among these, (1) greater inclusion and diversity effort (with a narrower angle of worker safety using tele-operated robots in mining) and (2) cloud-integrated mining processes are interesting topics for consideration in the telco domain (refer to Fig. 5.8.1.1-1 and Fig. 5.8.1.1-2).
Robots have a range of applications in the mining industry, contributing to increased safety, efficiency, and productivity. Here are some tasks that robots can perform in mining:
1. Exploration and Mapping: Robots can be equipped with various sensors, such as LiDAR and cameras, to explore and map underground or hazardous areas that might be dangerous for humans.
2. Drilling and Blasting: Automated drilling and blasting robots can accurately and safely bore holes for explosives, increasing precision and minimizing the risk to human operators.
3. Hauling and Transport: Robotic vehicles can be used for hauling materials, removing the need for human drivers in dangerous environments. These robots can transport materials within mines and even across long distances.
4. Inspection and Maintenance: Robots can inspect equipment and infrastructure, identifying issues before they become serious. They can also perform maintenance tasks in hazardous areas, reducing the need for human workers in risky environments. Note: Maintenance includes corrective maintenance and preventive maintenance. Predictive maintenance is related to preventive maintenance.
5. Remote Operation: Teleoperated or semi-autonomous robots can be controlled by operators from a safe location, allowing them to work in environments that are unsafe for humans.
6. Hazardous Environment Exploration: Robots can be deployed in areas with extreme temperatures, toxic gases, or other hazardous conditions, where human presence would be dangerous.
7. Material Sorting and Processing: Robots can be programmed to sort and process mined materials, improving efficiency and accuracy in material separation.
8. Surveying and Mapping: Robots equipped with advanced sensors can create detailed 3D maps of mining sites, helping with planning and optimization.
9. Search and Rescue: In the event of a mine collapse or other emergency, robots equipped with cameras and sensors can be used to search for trapped miners and assess the situation.
10. Environmental Monitoring: Robots can be used to monitor air quality, water quality, and other environmental factors in and around mining sites.
11. Dust Suppression: Robots can be designed to control dust levels, which is crucial for the health and safety of miners.
12. Rehabilitation and Land Restoration: After mining operations cease, robots can be employed to rehabilitate and restore mined areas, aiding in reforestation or other environmental recovery efforts.
The use of robots in mining can improve safety for human workers, increase operational efficiency, and enable the extraction of resources from challenging and hazardous environments.
Motivation and Discussion:
In an on-surface radio propagation environment, the received signal combines two paths (one being line-of-sight reception and the other a slightly longer reflected one, which are combined at the receiver's side), so the signal transmission and reception situation is not ideal due to the inherent 'phase difference' caused by reflection, as depicted in Fig. 5.8.1.1-3 below. The point of bringing this up is to lead to the main motivating message that the signal propagation situation is even more difficult in underground tunnels (as justified in the points below).
Fig. 5.8.1.1-3: Phase difference between two antennas on a horizontal ground surface with different heights. The phase difference is dependent upon several parameters, such as the antenna heights and other parameters a, b, l, and θ.
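The phase-difference idea can be illustrated with the textbook two-ray ground model, in which the reflected path exceeds the line-of-sight path by approximately 2*h_t*h_r/d for large horizontal separation d. This is a generic sketch for illustration; the figure's own parameters (a, b, l, θ) are not reproduced, and the function name is hypothetical.

```python
import math

# Generic two-ray ground-reflection sketch of the phase difference discussed
# above: for large horizontal separation d, the reflected path is longer than
# the line-of-sight path by approximately 2*h_t*h_r/d, giving a phase offset
# of 2*pi*delta_d/wavelength at the receiver. Illustration only; the figure's
# parameters (a, b, l, theta) are not reproduced here.

def phase_difference_rad(h_t: float, h_r: float, d: float, freq_hz: float) -> float:
    """Approximate phase difference between the direct and reflected rays."""
    wavelength = 3e8 / freq_hz               # c / f, in metres
    delta_d = 2 * h_t * h_r / d              # extra path length, valid for d >> h_t, h_r
    return 2 * math.pi * delta_d / wavelength
```

As the code shows, the phase offset grows with antenna heights and carrier frequency and shrinks with separation distance; in a tunnel, many such reflected paths with different offsets combine at once.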
In the underground tunnel environment, there are two characteristics that are simply observed:
1) Motivating argument #1: The signal reflections are multifold around the inner surface of the tunnel, which is furthermore "uneven";
2) Motivating argument #2: The communication nodes (i.e., UEs/robots or mining carts that have UE functionality) are distributed along the tunnel pathway.
Justification on point #1 (as motivating argument #1):
In underground tunnels, radio propagation faces significant challenges due to the confined and reflective nature of the environment. Signals can be absorbed, diffracted, or reflected by the tunnel walls, leading to multipath propagation. This phenomenon creates multiple signal paths between the transmitter and receiver, causing delays and phase differences in the received signals. As a result, the radio propagation situation in underground tunnels is complex, making it difficult to achieve reliable and stable communication. Specialized techniques and equipment are often required to mitigate these challenges and ensure effective wireless communication in such environments.
Justification on point #2 (as motivating argument #2):
The distribution of mining carts (which we assume are equipped with UE functionality and which we refer to as robots) in a mining tunnel typically involves strategic planning and organization to ensure efficient transportation of materials and resources within the mine. Mining carts, also known as mine cars or skips, are used to transport extracted ore, waste, or other materials from the mining face to the surface or processing area. The distribution process involves several key considerations:
1) Loading and Filling: Mining carts are loaded with ore or other materials at the mining face. Proper loading ensures maximum utilization of the cart's capacity while maintaining safety standards.
2) Transportation Routes: Mining tunnels are designed with specific transportation routes for the mining carts. These routes are planned to optimize the movement of carts, minimize congestion, and ensure a smooth flow of materials within the mine.
3) Track Systems: Mining carts often run on track systems embedded in the tunnel floor. These tracks guide the carts, preventing derailments and ensuring stable movement along the designated routes.
4) Automated Systems: In modern mining operations, automated systems, such as conveyor belts or autonomous vehicles, might be used to transport materials. These systems can enhance efficiency and reduce the need for manual distribution of mining carts.
5) Monitoring and Control: Mining companies use monitoring systems to track the movement of mining carts. Sensors and communication technologies are employed to monitor the location, load capacity, and maintenance needs of the carts. This data helps optimize the distribution process and prevent bottlenecks.
6) Safety Protocols: Safety is paramount in mining operations. Adequate safety protocols, including signaling systems, speed limits, and emergency procedures, are in place to ensure the safe distribution of mining carts and the protection of workers in the tunnels.
Overall, the distribution of mining carts in mining tunnels requires careful planning, efficient logistics, and the integration of technology to optimize the transportation of materials and maintain a safe working environment.
It is commonly agreed that carts cannot remain outside the designated pathway space of the tunnel unless unintended events occur.
Characteristics of the "set of UE relays" consisting of robots in a mining tunnel:
1) Serially distributed;
2) The random variable of inter-node distance depends on the mining operation strategy;
3) If the communication nodes (e.g., carts with UE functionality) can use non-conventional (or advanced) communication/networking schemes, such as cooperative diversity (or cooperative communication/networking) and network coding, the topology of the network is not completely serial but a combined one.
Fig. 5.8.1.1-4: (a) conventional, serially connected multihop network (b) non-conventional, serially connected multihop network with a supplementary link supported (e.g., when cooperative diversity is applied). |
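A rough way to quantify the benefit of a supplementary link as in case (b) is to compare end-to-end delivery probabilities; the per-link reliabilities below are assumed values, not measured ones:

```python
from math import prod

# Sketch (assumed link-success probabilities) comparing a purely serial
# multihop chain with a chain that has one supplementary "bypass" link,
# as when cooperative diversity lets node i reach node i+2 directly.

def serial_success(p_links):
    """End-to-end success of a serial chain: every hop must succeed."""
    return prod(p_links)

def with_bypass(p_links, i, p_bypass):
    """Hops i and i+1 are backed by a direct link from node i to node i+2."""
    segment = 1 - (1 - p_links[i] * p_links[i + 1]) * (1 - p_bypass)
    rest = prod(p_links[:i]) * prod(p_links[i + 2:])
    return rest * segment

links = [0.95, 0.80, 0.85, 0.95]     # hypothetical per-hop reliabilities
print(serial_success(links))          # serial only
print(with_bypass(links, 1, 0.70))   # bypass spanning the two weakest hops
```

Even a modest supplementary link over the weakest segment raises the end-to-end figure noticeably, which motivates the combined topology above.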
3239937b55fdc406849fa93870986364 | 22.916 | 5.8.1.2 Enhanced support of communications and sensing features | Mining as a whole takes place in extreme environments under the ground surface (e.g., search, monitoring, preparation, processing, maintenance, repair shop, mining actuation (i.e., drilling operations), loading and underground delivery to an off-surface station), and some tasks require completion on the ground surface.
According to the US Energy Information Administration (www.eia.gov, as of 2021), the average number of employees at underground and surface mines differs from one State to another: 755 at underground and 272 at surface mines in Pennsylvania; 2103 at underground mines in West Virginia (Northern); 184 at underground and 36 at surface mines in West Virginia (Southern). Along a single tunnel or multiple tunnels, a set of tandem communication sub-networks can be formed as depicted in Fig. 5.8.1.2-1. Each sub-network might include a 3GPP UE-type entity that only requires intermittent communications (e.g., an Ambient IoT device).
NOTE: It is assumed that a set of multihop UE relays consists of a group of service robots that are performing specific task(s) with (autonomous or tele-operated) physical mobility inside the mining job site. However, non-robot types of UE can also be part of a set of multihop UE relays.
Fig. 5.8.1.2-1: Some examples of sets of multihop UE relays of robots at underground mines. |
3239937b55fdc406849fa93870986364 | 22.916 | 5.8.2 Related existing service requirements | Clock synchronisation: 3GPP TS 22.104
• clause 5.6.1 Clock synchronisation service level requirements
• clause 5.6.2 Clock synchronisation service performance requirements
• clause 7.2.3.2 Clock synchronisation requirements
NOTE 1: The MEC scenario described in 5.X.1 assumes collaboration among the group of aerial robots. The data collection and sensor fusion aspects (in clause 5.3 and in Annex A – Levels of Fusion) are still important considerations for this MEC scenario.
NOTE 2: The types of sensor data and media that robots are collecting, pre-processing and sharing with each other and/or with edge cloud (or edge server, cloud server) are related to the need of fulfilling the above sets of requirements.
Multi-path relay: 3GPP TS 22.261
• clause 6.9.2.1 support of a traffic flow of a remote UE via different indirect network connection paths
Positioning: 3GPP TS 22.261
• clause 7.3.2 High accuracy positioning performance requirements (see also clause 5.7.1 of 3GPP TS 22.104 for Factory of the Future scenario)
Efficient user plane: 3GPP TS 22.261
• Clause 6.5 Efficient user plane
Service continuity: 3GPP TS 22.263
• clause 5.5 Service continuity, when required by the application
Multi-hop connectivity: 3GPP TS 22.261
Energy efficiency: 3GPP TS 22.261
Integrated Sensing and Communication features: 3GPP TS 22.261, TS 22.137 (new). |
3239937b55fdc406849fa93870986364 | 22.916 | 5.8.3 Challenges and potential gaps | The following applicable aspects are identified and recommended for further study and can be further considered with other ongoing or recently completed Studies if applicable.
[CPG-5.8.3-001] The 5G system is expected to provide a suitable means that can support an advanced energy-efficient mechanism for a group of robots that form a set of multihop UE relays.
[CPG-5.8.3-002] The 5G system is expected to provide a suitable means for robots that form a set of multihop UE relays to gain access to MEC service in order to determine whether or not to use a certain advanced energy-efficient mechanism for the group.
[CPG-5.8.3-003] The 5G system is expected to provide a suitable means for robots within a set of multihop UE relays to timely reselect another energy-efficient mechanism for the set of multihop UE relays when communication is disrupted (e.g., connection loss, or the existing energy-efficient mechanism becomes unavailable due to remarkable changes in inter-robot distance).
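A minimal sketch of the selection and reselection behaviour targeted by these CPGs, with wholly assumed mechanism names and cost models (the actual candidate mechanisms and their evaluation are up to the relevant 3GPP Working Groups):

```python
# Illustrative (not normative) selection logic for [CPG-5.8.3-001/003]:
# each candidate mechanism has an assumed energy cost that grows with
# inter-node distance, and the set of relays reselects when the current
# mechanism becomes unavailable or distances change markedly.

MECHANISMS = {
    # name: (fixed cost per hop, distance exponent) -- assumed values
    "direct_multihop": (1.0, 2.0),
    "cooperative_diversity": (1.4, 1.6),
    "network_coding": (1.8, 1.5),
}

def energy_cost(name, inter_node_distance):
    fixed, exponent = MECHANISMS[name]
    return fixed * inter_node_distance ** exponent

def select_mechanism(distance, available):
    """Pick the cheapest currently available mechanism."""
    candidates = [m for m in MECHANISMS if m in available]
    return min(candidates, key=lambda m: energy_cost(m, distance))

# Short inter-cart spacing favours plain multihop; when carts spread out
# (or the current mechanism fails), the relays reselect another mechanism.
print(select_mechanism(2.0, MECHANISMS))            # dense deployment
print(select_mechanism(40.0, MECHANISMS))           # sparse deployment
print(select_mechanism(40.0, {"direct_multihop"}))  # after a failure
```

The MEC service of [CPG-5.8.3-002] would host the (computationally heavier) evaluation of `energy_cost` over real measurements instead of these toy constants.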
NOTE 1: The above CPGs are intended to ensure that a group of robots as a UE should have advanced energy-efficient mechanisms that are relevant to existing and new spectrum bands for 3GPP systems (e.g., cooperative diversity, full duplex, network coding) [26]. However, studies on such candidate mechanisms are up to the relevant Working Group within 3GPP, which is outside the scope of stage-1 study.
NOTE 2: The above CPGs are intended for a group of robots as a UE that are working under extreme conditions (e.g., at underground mines).
NOTE 3: In [CPG-5.8.3-002], examples of "gain access to MEC service" include requesting distributed compute service (e.g., via edge or cloud) for a certain task that is computationally intensive, such as scheduling or deciding whether or not to use certain advanced mechanisms. |
3239937b55fdc406849fa93870986364 | 22.916 | 6 Other considerations | |
3239937b55fdc406849fa93870986364 | 22.916 | 6.1 TACMM aspects related to robot applications | |
3239937b55fdc406849fa93870986364 | 22.916 | 6.1.1 General description | 3GPP Release 18 stage-1 study on tactile and multimodality communication (TACMM) [28] involves identifying potential service and performance requirements for efficient data transmission and coordination across devices (e.g., haptic glove, haptic wear, and so on, as a 5G UE) and networks. These advancements have the potential to revolutionize how we interact remotely, making communication more immersive, expressive, and accessible.
NOTE 1: In the Release 18 TACMM study, the scope was focused on providing multimodality communication between a UE and a 5G network with no intermediate nodes acting as a relay. In a multi-hop connection scenario, certain devices (such as gloves, goggles) are connected to the 5G network via a 5G UE. In this case, the devices are not considered as 5G UEs. Devices (gloves, goggles, etc.) are only considered as 5G UEs when they are directly connected to the 5G network as shown in Fig. 6.1.1-1. It is observed that multi-hop aspects are not considered in the consolidated KPI table. However, some multi-hop aspects are captured in 3GPP TS 23.501, clause 5.37.2.
Fig. 6.1.1-1: Example of topology studied within 3GPP Release 18 TACMM (single hop connections only). Source: Fig. 5.1.2-1 of 3GPP TR 22.847.
The following use cases from the TACMM study (TR 22.847) are related to robot applications.
• 5.2 Remote control robot
• 5.4 Support of skillset sharing for cooperative perception and manoeuvring of robots
• 5.5 Haptic feedback for a personal exclusion zone in dangerous remote environments
• 5.8 Virtual factory
UC - Remote control robot
Refer to the next UC.
UC - Support of skillset sharing for cooperative perception and manoeuvring of robots
Automated vehicles (robots) mostly rely on individual controllers, which can be limiting in unpredictable settings. Efforts by 3GPP have explored LTE-based V2V features, but challenges persist in unstructured contexts. Sharing sensor info and maneuvers is a solution, facilitated by Tactile Internet for fast data exchange between nearby vehicles. This enables cooperative perception and maneuvering, enhancing safety and prediction. Tactile Internet allows vehicles to collectively perceive their surroundings through shared local and remote maps, extending sensing range and prediction. New network architectures are needed for ultralow-latency connections based on Tactile Internet, supporting cooperative driving. This addresses scenarios like local delivery robots, enhancing communication for skill sharing.
Fig. 5.4.4-1 (in 3GPP TR 22.847) presents examples of the benefit of using skillset sharing, where the skillset is in the form of multimodality information and/or control. These scenarios are suitable for a robot group operation scenario using ProSe and the Uu interface (e.g., a group of tele-operated robots for hazardous control or mining, covering both underground and on-surface consolidated operations).
UC - Haptic feedback for a personal exclusion zone in dangerous remote environments
The advent of 5G networks has enabled tele-operations in industries like mining. To ensure safety, wearables like belts and shoe soles are used to improve alarm reliability, especially in situations where traditional alarms might be missed due to protective gear. Haptic feedback, detected faster by the brain than audio or visual cues, is integrated into wearables for quicker alerts in hazardous environments. A multi-modal approach combines haptic, audio, and visual signals for effective emergency response.
Surveillance cameras, drones, and wearables monitor environments, with data sent to a control unit for hazard prediction. Personal exclusion zones, restricting access, are defined and navigation is adjusted based on worker data. In mining, a hazard triggers a multi-modal alarm system, combining haptic, audio, and visual alerts, as well as surveillance and worker feedback, to guide workers away from danger.
This use case has identified the following PRs for single-hop cases:
[PR. 5.5.6-1] The 5G network shall support a mechanism to allow an authorized 3rd party to provide QoS policy for flows of multiple UEs associated with an application. The policy may contain e.g. the expected 5GS handling and the associated triggering event.
[PR. 5.5.6-2] The 5G system shall support a mechanism to apply QoS policy for flows of multiple UEs associated with an application received from an authorized 3rd party.
UC - Virtual Factory:
A factory utilizes a 5G network to synchronize its robot's motion data and monitor video. The robot's movements are collected through a dataflow (say, dataflow 1), while a monitor captures real-time motion via another dataflow (say, dataflow 2). The network categorizes these flows as instructed. Dataflow 1 generates VR video on a VR server based on motion data, while dataflow 2 is directly sent to a remote screen. At the remote site, VR glasses receive VR video, and a monitor displays the actual robot motion. To minimize delay, dataflow 2 is briefly held by the network as motion data is processed into VR video. This setup ensures seamless coordination between robot motion and video monitoring, facilitated by 5G technology.
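The holding step described above can be sketched as follows; the processing and network delay figures are hypothetical:

```python
# Sketch of the holding step described above (assumed figures): the network
# briefly delays the direct video flow (dataflow 2) by the time the VR
# server needs to turn motion data (dataflow 1) into VR video, so both
# reach the remote site aligned.

def hold_time_ms(vr_processing_ms, net_delay_flow1_ms, net_delay_flow2_ms):
    """Extra delay to apply to dataflow 2; never negative."""
    arrival_flow1 = net_delay_flow1_ms + vr_processing_ms
    return max(0.0, arrival_flow1 - net_delay_flow2_ms)

# Hypothetical numbers: 20 ms to render VR video, similar network delays.
hold = hold_time_ms(vr_processing_ms=20.0,
                    net_delay_flow1_ms=8.0,
                    net_delay_flow2_ms=7.0)
print(f"hold dataflow 2 for {hold:.1f} ms")  # 21.0 ms in this example
```

In the real system this coordination would be driven by the 3rd-party policy of [PR 5.8.6-2] rather than fixed constants.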
This use case has identified the following PRs for single-hop cases:
[PR 5.8.6-1] 5G system shall be able to support the interaction with applications on UEs or data flows grouping information within one tactile and multi-modal communication service.
[PR 5.8.6-2] The 5G system shall support a means to apply 3rd party provided policy(ies) for flows associated with an application. The policy may contain e.g. the set of UEs and data flows, the expected QoS handling and associated triggering events, other coordination information.
NOTE 2: The policy can be used by a 3rd party application for coordination of the transmission of multiple UEs’ flows (e.g., haptic, audio and video) of a multi-modal communication session. |
3239937b55fdc406849fa93870986364 | 22.916 | 6.1.2 Challenges and potential gaps | The PRs referenced above are limited to a single-hop scenario. To extend to a multi-hop scenario, the following can be further considered with other ongoing or recently completed studies if applicable.
[CPG-6.1.2-001] 5G system is expected to be evolved so that it can support some or all of the referenced PRs in both direct network connections and indirect network connections.
NOTE: The suitable number of hops that are required to support depends on the types and characteristics of robot applications. |
3239937b55fdc406849fa93870986364 | 22.916 | 6.2 ISAC aspects related to robot applications | |
3239937b55fdc406849fa93870986364 | 22.916 | 6.2.1 General description | Autonomous Mobile Robots (AMRs) are revolutionizing logistics operations in industries such as manufacturing and warehousing. Unlike traditional Automated Guided Vehicle (AGV) systems, AMRs possess the intelligence to navigate and perform tasks autonomously, offering unparalleled flexibility. However, a significant challenge for AMRs lies in acquiring accurate and continuous sensing data as they move through complex environments [29,30]. Factors like unexpected obstacles or sudden appearances of people and machines can compromise their safety.
To address this challenge, 5G base stations are strategically deployed in factories. These stations serve a dual purpose: not only do they provide essential communication capabilities for factory equipment, but they also act as sensing nodes. By transmitting sensing signals and receiving reflected signals, these base stations capture real-time 3GPP sensing data. This data is then processed and analyzed by the core network, generating valuable insights into the AMRs' surroundings.
Crucially, these sensing results can be shared with trusted third-party platforms, empowering AMRs with a comprehensive understanding of their environment. This enhanced sensing capability allows AMRs to adapt their routes dynamically, ensuring both efficiency and safety. Moreover, in scenarios where obstacles obstruct radio signals or AMR paths traverse indoor and outdoor spaces, collaboration between multiple base stations further improves sensing accuracy and service continuity.
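How an AMR might consume such network-provided sensing results can be sketched as follows; the grid route representation and obstacle format are assumptions for illustration only:

```python
# Illustrative sketch: an AMR consumes sensing results exposed by the
# network (here a plain set of blocked grid cells, an assumption) and
# swaps to a detour when its planned route intersects a sensed obstacle.

def route_blocked(route, sensed_obstacles):
    """True if any waypoint on the route coincides with a sensed obstacle."""
    return any(cell in sensed_obstacles for cell in route)

def choose_route(primary, detour, sensed_obstacles):
    """Adapt dynamically: keep the primary route unless sensing blocks it."""
    return primary if not route_blocked(primary, sensed_obstacles) else detour

primary = [(0, 0), (1, 0), (2, 0), (3, 0)]
detour = [(0, 0), (0, 1), (1, 1), (2, 1), (3, 1), (3, 0)]

# Before the base stations report anything, take the short route.
print(choose_route(primary, detour, set()))
# A person is sensed at (2, 0): switch to the detour dynamically.
print(choose_route(primary, detour, {(2, 0)}))
```

Collaboration between multiple base stations would simply enlarge and refresh the sensed-obstacle set that feeds this decision.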
[30] presents a few use cases addressing sensing issues related to robot operation. Examples include the following with some KPIs as shown in, e.g., Table 5.23.6-1 and Table 5.32.6-1:
[PR 5.23.6-1] The 5G system shall be able to provide the continuity of sensing service for a specific target object, across indoor and outdoor.
[PR 5.23.6-2] The 5G system shall be able to provide a secure mechanism to ensure sensing result data privacy within the sensing service area.
[PR 5.32.6-1] Based on operator’s policy, the 5G system may provide a mechanism for a trusted third party to provide sensing assistance information about a sensing target. |
3239937b55fdc406849fa93870986364 | 22.916 | 6.2.2 Challenges and potential gaps | While the PRs referenced above have addressed potential new requirements, such as on the continuity of sensing services both in indoor and outdoor settings, data privacy and network exposure, it is still worthwhile to consider the support of scalable and efficient use of communication resources needed for stable operation of multiple service robots especially when a large number of service robots are present. |
3239937b55fdc406849fa93870986364 | 22.916 | 6.3 Metaverse aspects related to robot applications | |
3239937b55fdc406849fa93870986364 | 22.916 | 6.3.1 General description | The Technical Report on Localized Mobile Metaverse Services [31] addressed several use cases related to robot applications. The following includes some examples.
1) Metaverse use case on Spatial Mapping and Localization (Clause 5.5, [31])
In the context of robot operations, spatial mapping involves constructing or updating a map of an unknown location, while localization tracks an object to identify its location and orientation over time. For the localized mobile Metaverse use case (e.g., clause 5.5, [31]), communication technology support is vital. Spatial Mapping Service creates and maintains a 3D map of indoor or outdoor environments, enabling the identification of stationary and moving objects. This spatial map is utilized by Spatial Localization Service, allowing for accurate positioning and orientation of users.
Spatial mapping gathers sensing data to create a detailed 3D map, integrating information from various sources like sensors and architectural specifications. Localization, based on this spatial map, identifies users' positions and viewing angles in 3D space. Examples of spatial mapping applications include government projects mapping entire cities, navigation service providers mapping roads, and customers mapping indoor spaces. To achieve this, vehicles or robots equipped with multiple cameras and LiDAR devices capture images and depth information. This data is then processed, enhancing location accuracy and enabling various applications like visual positioning and Metaverse content management. The information is communicated through uplink sensor data, enabling precise localization and enriching the spatial internet experience.
The following includes some examples of potential new requirements:
[PR 5.5.6.1-1] Subject to operator policy and relevant regional and national regulation, the 5G system shall support mechanisms for an authorized UE to provide sensing data that can be used to produce or modify a spatial map.
[PR 5.5.6.1-2] Subject to operator policy, user consent and relevant regional and national regulation, the 5G system shall support mechanisms to receive and process sensing data to produce or modify a spatial map.
[PR 5.5.6.1-3] Subject to operator policy and relevant regional and national regulation, the 5G system shall support mechanisms to expose a spatial map or derived localization information from that map to authorized third parties.
NOTE 1: Some KPIs required to fulfil, e.g., [PR 5.5.6.1-2] are not specified in this particular use case.
2) Metaverse use case on Immersive tele-operated driving (Clause 5.20, [31])
The use case involves operating vehicles, lifting devices, or machines/robots in hazardous industrial environments, where manual operation poses risks due to exposure to dangerous materials, extreme conditions, or radioactivity. While Automated Guided Vehicles (AGVs) exist, the proposal suggests leveraging 5G technology to create a system allowing remote users to control these devices. This control occurs through an immersive cockpit displayed on a virtual reality head-mounted display and haptic gloves. The cockpit integrates data from the digital twin of the operating environment, including information from factory sensors and the surroundings. This approach enhances user safety and operational accuracy by merging data from the digital twin, enabling remote control in hazardous settings.
This use case includes some requirements that are needed to support immersive teleoperations (or digital twins) within a service area up to 10km (radius).
[PR 5.20.6-1] The 5G system shall be able to provide a means to associate data flows related to one or multiple UEs with a single digital twin maintained by the mobile metaverse service.
[PR 5.20.6-2] The 5G system shall be able to provide a means to support data flows from one or multiple UEs to update a digital twin maintained by the mobile metaverse service. |
3239937b55fdc406849fa93870986364 | 22.916 | 6.3.2 Challenges and potential gaps | None. |
3239937b55fdc406849fa93870986364 | 22.916 | 6.4 High-level communication scenarios | Robotic Application Enablement Communication (in short, Robotic Communication) is a technology that enables a group of robots (or a robot) to communicate with various entities, including other robots (or robotic applications), humans (e.g., residents, workers, pedestrians in indoor/outdoor structured or unstructured environments), infrastructure, and networks. Here are several scenarios for Robotic Communication.
NOTE 1: A robot with communication functionality is a physical entity that can act as an ordinary UE or as a UE acting as a relay for another UE (e.g., UE Relay), and that is fixed or mobile depending on the scenario.
NOTE 2: In the following scenarios, the communication direction is assumed to be unidirectional or bidirectional.
NOTE 3: In the following scenarios, the communication path is assumed to be single hop or multi-hop. In a multi-hop scenario, a combination of different scenarios is possible when applicable, e.g., R2R and R2C.
NOTE 4: In some of the following scenarios, the counterpart of a robot (or a group of robots) in communication path can be an IoT device, an ordinary UE, a UE acting as a relay, and so on.
1. Robot-to-Robot (R2R) Communication:
- Collision Avoidance: Robots can exchange information about their speed, direction, and position to avoid collisions, especially in intersections or blind spots.
- Traffic Congestion Management: R2R communication allows robots to share real-time traffic information, helping them choose less congested routes. Both road traffic (e.g., in public roads) and robot traffic are considered.
- Examples:
• A group of robots for public safety purposes.
• Indoor and outdoor delivery robots.
• Internet of aerial robots (or Internet of drones).
• A group of robots using machine-type media communications (e.g., media between robots).
• Robots maneuvering in unstructured environments or in structured environments such as public roads, sidewalks.
• In the United States, currently, robot operations lack national regulation and are governed by individual states. Virginia set a precedent in 2017 by regulating robot operation, allowing them to move on sidewalks and crosswalks at speeds not exceeding 10 mph (ca. 16km/h). If these paths are unavailable, robots are permitted to operate on the roadside with a speed limit of 25 mph (ca. 40km/h). In 2020, Pennsylvania enacted regulations allowing robots weighing up to 550 pounds (ca. 250kg) without payload [29].
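The R2R collision-avoidance exchange listed above can be sketched as a closest-approach computation over broadcast state; the message contents (position, velocity) and the safety radius are assumptions:

```python
# Minimal collision-avoidance check over exchanged R2R state (positions in
# metres, velocities in m/s -- assumed message contents): each robot
# computes the minimum future separation from a peer's broadcast state,
# assuming constant velocities over a short horizon.

import numpy as np

def min_separation(p1, v1, p2, v2, horizon_s=10.0):
    """Closest-approach distance within the horizon."""
    dp = np.array(p2, float) - np.array(p1, float)
    dv = np.array(v2, float) - np.array(v1, float)
    denom = float(dv @ dv)
    t_star = 0.0 if denom == 0 else min(max(-float(dp @ dv) / denom, 0.0), horizon_s)
    return float(np.linalg.norm(dp + dv * t_star))

# Two sidewalk robots approaching each other head-on.
d = min_separation(p1=(0, 0), v1=(1.5, 0), p2=(20, 0), v2=(-1.5, 0))
print(f"closest approach: {d:.1f} m")   # 0.0 m -> issue a warning
if d < 2.0:   # assumed safety radius
    print("collision warning: adjust speed or path")
```

Real deployments would add heading uncertainty and latency compensation, but the exchanged state fields are exactly those named in the bullet above.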
2. Robot-to-Infrastructure (R2I) Communication:
- Traffic Signal Coordination: Robots can receive data from traffic signals to optimize their speed and reduce unnecessary stops, improving traffic flow.
- Roadside Assistance: Remote drivers can receive alerts and assistance information, such as nearby charging stations or emergency services, from roadside infrastructure.
NOTE 5: For example, in some States of the United States where robots are legally considered as road vehicles, such robots are assumed to be capable of supporting communication features relevant to Cellular V2X (e.g., LTE-V2X, NR-based V2X).
- Examples:
• Indoor and outdoor delivery robots.
• Internet of aerial robots (or Internet of drones).
• Car valet robot at parking lots
• A group of robots using machine-type media communications (e.g., media from a robot to infrastructure).
• A group of aerial robots (e.g., UAVs) performing geo-surface sensing and/or environmental monitoring.
3. Robot-to-Pedestrian (R2P) Communication:
- Pedestrian/Human Safety: Robots can detect pedestrians/humans equipped with or without communication devices, issuing warnings to both robot operators and pedestrians/humans to prevent accidents or collisions.
- Crosswalk and Human-Zone Safety: Pedestrians/humans can receive alerts on their devices (UEs, tethered/untethered devices) about approaching robots, ensuring timely safety-related alerts.
- Examples:
• Surveillance robot (e.g., in CCRC)
• Disinfection robot (e.g., in hospitals, hotels)
• Outdoor delivery robots allowed to use public roads some of which are shared by humans.
• Personalized assistive robots: personalized digital experience (e.g., VR/XR/MR assisted shopping, gaming). Both physical robots and virtual bots are considered where virtual bots are assumed to be linked to authorized multi-sensory input (e.g., acoustic signals, voice, audio, or visual input such as facial expressions, gesture, and/or gait) and display mechanisms that are security and privacy ensured.
4. Robot-to-Network (R2N) Communication:
- Robot/road Traffic Management: Traffic authorities can collect real-time data from robots to monitor traffic patterns, analyze congestion, and adjust traffic signals for optimal flow. See NOTE 5 for outdoor delivery robots that are considered as a regular vehicle depending on local regulatory requirements.
- Emergency Response: Robots can transmit information about accidents or road hazards directly to emergency services, enabling faster response times.
- Timely Response: Robots that are in inter-continental real-time trading environments (e.g., trans-Atlantic financial market), realistic mixed-reality gaming and entertainment, and so on.
NOTE 6: The term "Network" can include 3GPP NTN and TN. Different from R2I scenarios, R2N scenarios include communication with a server. Also, refer to V2I (clauses 4, 5.6, 5.7, and 5.8 of TR 22.185) and V2N (clauses 5.15, 5.26, and 5.27 of TR 22.185).
5. Robot-to-Cloud (R2C) Communication:
NOTE 7: This communication scenario describes the use of edge or cloud for, e.g., AI/ML-related computation that is commonly necessary to support intelligent operations of a robot or a group of robots. This scenario can be overlaid with other scenarios such as R2N or R2I.
- Fleet Management: Fleet operators can track robots, monitor their health, and optimize routes, leading to efficient operations and reduced maintenance expenses.
- Over-the-Air (OTA) Updates: Robot manufacturers can remotely deploy software updates and patches to improve robot performance and security.
- Examples:
• A group of robots in healthcare, delivery, and/or manufacturing environments.
• Tele-operated robots with haptic feedback/control.
6. Robot-to-Grid (R2G) Communication:
- Smart Charging: Electric robots can communicate with the grid to schedule charging during off-peak hours, optimizing energy usage and reducing costs for both consumers and utilities.
- Grid Stability: Electric robots can provide feedback to the grid about available battery capacity, enabling the grid to balance demand and supply effectively.
- Examples:
• A group of robots in smart city settings, in hazardous control environments.
7. Robot-to-Home (R2H) Communication:
- Home Automation: Robots can communicate with smart home devices, allowing users to control home appliances, lighting, and security systems remotely from their robots.
- Energy Management: Electric robots can supply power back to homes during peak demand periods, reducing the load on the grid and lowering household electricity expenses.
- Examples:
• A group of robots in smart home that can provide timely and accurate information for the smart home server for high-quality decision-making. |
3239937b55fdc406849fa93870986364 | 22.916 | 7 Conclusions and recommendations | The present document includes the outcome of the Study on Network of Service Robots with Ambient Intelligence. This includes various use cases with challenges and potential gaps, together with other consideration points related to efficient communications service and cooperative operation for a group of service robots. It is recommended that future work consider the present document for enhancements of communications support relevant to the network of service robots.

Annex A: Levels of Fusion

A.1 Levels of Fusion

A.1.0 Description

The use of "levels of fusion" [14,15,16] is expected to help the United Nations Sustainable Development Goals (SDGs) in several aspects. Provided that the 6G technology enablers are designed in resource-efficient ways for various types of resources (e.g., radio resources, network resources, and materials such as battery-related resources), such considerations can also help provide affordable 6G services in society, especially when certain groups of residents, patients, public-safety officers, or the underrepresented need the communication services the most at a critical point in time in their everyday living.

A.1.1 A Six-Level Approach

This approach is based on the U.S. Department of Defense Joint Directors of Laboratories (JDL) Data Fusion Subgroup. Each of the following six levels of fusion progressively adds meaning at higher levels of abstraction and involves more analysis (Reference: NIST [14]):

Level 0 - organize. This is the initial processing accomplished at or near the sensor that organizes the collected data into a usable form for the system or person who will receive it.

Level 1 - identify/correlate. This level takes new input and normalizes its data, correlates it into an existing entity database, and updates that database. Level 1 Fusion tells you what is there and can result in actionable information.

Level 2 - aggregate/resolve.
This level aggregates the individual entities or elements, analyzes those aggregations, and resolves conflicts. It captures or derives events or actions from the information and interprets them in context with other information. Level 2 Fusion tells you how they are working together and what they are doing.

Level 3 - interpret/determine/predict. This level interprets enemy events and actions, determines enemy objectives and how enemy elements operate, and predicts enemy future actions and their effects on friendly forces. This is a threat-refinement process that projects current situations (friendly and enemy) into the future. Level 3 Fusion tells you what it means and how it affects your plans.

Level 4 - assess. This level consists of assessing the entire process and related activities to improve the timeliness, relevance, and accuracy of information and/or intelligence. It reviews the performance of sensors and collectors, as well as analysts, information management systems, and staffs involved in the fusion process. This process tells you what you need to do to improve the products from Fusion Levels 0-3.

Level 5 - visualize. This process connects the user to the rest of the fusion process so that the user can visualize the fusion products and generate feedback/control to enhance/improve these products.

A.1.2 A Three-Level Approach

A three-level approach was proposed in [15] as follows. In terms of data processing, multi-modal fusion is typically implemented at three different levels of abstraction:

Sensor level: The fusion module processes raw data captured from different sources. If multiple sensors are measuring the same physical aspect, a single feature vector representing the phenomenon under analysis can be directly combined. However, if the sensor data represent different phenomena, the data fusion should be completed at a later stage.

Feature level: Sensor data are represented by feature vectors. Features are extracted from different sources independently.
The fusion module then combines the feature vectors from each module into a single fused feature vector.

Decision level: Features are extracted from each source independently and passed to a corresponding classification module to make its own decision. The fusion module then consolidates these decisions into one final classification decision.

Hybrid models: These models can include more than one level of fusion. For example, features from two modalities can be combined to construct one feature set for a classification model; the decisions of this classification model are then combined with decisions from a second classification model that is trained using features from a third modality.

Fig. A.1.2: A three-level approach for data fusion (source: [15] IEEE).

Annex B: Example Scenario

B.1 Example Scenario for robot applications to adjust the accuracy level of clock synchronization provided by the 5G system

Energy-efficient robots collaborate to jointly work on a certain task in an area of interest, such as cleaning, disinfection, and agriculture. They adapt actions based on environmental conditions, emphasizing accuracy while optimizing computing and communication resources. The application server is aware that a higher accuracy level of clock synchronization is required in a more complex environment of interest (e.g., Area 1 in Figure B.1-1). The application requests the monitoring of the accuracy level for clock synchronization from the 5G system, and the 5G system can perform the monitoring based on the application request and report the measured results. In addition, the 5G system can inform the application server of its accuracy-checking capability. The application server can request the 5G system to adjust the accuracy level in order to maintain the proper accuracy level needed for robot applications in the particular area.
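The Annex B interaction can be sketched as follows; the area names, accuracy values, and request format are assumptions for illustration, not values from TS 22.104:

```python
# Sketch of the Annex B interaction (names and thresholds are assumed):
# the application server holds a per-area clock-synchronization accuracy
# requirement, reads the accuracy the 5G system reports, and requests an
# adjustment when the reported value does not meet the requirement.

REQUIRED_ACCURACY_US = {"area_1": 1.0, "area_2": 10.0}  # tighter in Area 1

def needed_action(area, reported_accuracy_us):
    """Return the request the application server would send, if any."""
    required = REQUIRED_ACCURACY_US[area]
    if reported_accuracy_us > required:
        return f"request accuracy <= {required} us in {area}"
    return None

print(needed_action("area_1", 2.5))   # out of spec -> adjustment request
print(needed_action("area_2", 2.5))   # within spec -> no request
```

The same comparison runs in the other direction when the 5G system reports a change in its accuracy capability to the robots.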
Figure B.1-1: Example scenario adjusting clock synchronization accuracy level
Table 5.6.2-1 of 3GPP TS 22.104 [2] presents various scenarios that require different accuracy levels of clock synchronization. In practice, there are occasions in which robots need to adapt to changes in the required accuracy level of clock synchronization (e.g. due to changes in the robot task or in the environment), or the 5G system needs to inform robots of a change in its capability regarding the accuracy level.
Annex C: Change history
Date | Meeting | TDoc | Subject/Comment | New version
2022-08 | SA1#99-e | S1-222282 | TR skeleton, scope | 0.1.0
2022-11 | SA1#100 | S1-223575, S1-223724 | Online cooperative high-resolution 3D map building; Real-time cooperative safety protection | 0.2.0
2023-02 | SA1#101 | S1-230352, S1-230354, S1-230355, S1-230356, S1-230382, S1-230383, S1-230398 | Terminology n SOBOT; Patrol robots in CCRC; Real-time conversational robot; Multimodal Sensors Fusion in Multi-Robot Scenarios; Annex: Fusion Levels for Robotic Applications; Machine-type media communication; Update on cooperative safety protection | 0.3.0
2023-05 | SA1#102 | S1-231535, S1-231548, S1-231549, S1-231802, S1-231720, S1-231547 | Editorial; Update on 5.1; Update on 5.2; Update on 5.3; MEC for Efficient Management of Geo-surface Sensing Data Using a Group of Aerial Robots; Smart community with service robots | 0.4.0
2023-08 | SA1#103 | S1-232518 | Inclusion of: S1-232089; S1-232519; S1-232506; S1-232520; S1-232521; S1-232516 | 0.5.0
2023-11 | SA1#104 | S1-233260 | Inclusion of: S1-233356; S1-233363; S1-233358; S1-233354; S1-233355; S1-233362; S1-233192; S1-233366 | 0.6.0
2023-12 | SA#102 | SP-231399 | MCC Clean-up for presentation to one-step approval | 1.0.0
2023-12 | SA#102 | - | Approved by SA#102 | 19.0.0
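The three-level fusion approach summarised in Annex A.1.2 above can be sketched in a few lines. This is a minimal illustration, not part of the TR: the modality names, feature values, and the majority-vote rule used at the decision level are assumptions.

```python
from collections import Counter

def feature_level_fusion(feature_vectors):
    """Concatenate independently extracted per-source feature vectors
    into a single fused feature vector (feature-level fusion)."""
    fused = []
    for vec in feature_vectors:
        fused.extend(vec)
    return fused

def decision_level_fusion(decisions):
    """Consolidate independent per-classifier decisions into one final
    classification decision (decision-level fusion via majority vote)."""
    return Counter(decisions).most_common(1)[0][0]

# Two hypothetical modalities, each already reduced to a feature vector.
audio_features = [0.2, 0.7]
video_features = [0.9, 0.1, 0.4]
print(feature_level_fusion([audio_features, video_features]))  # [0.2, 0.7, 0.9, 0.1, 0.4]

# Three independent classifiers voting at the decision level.
print(decision_level_fusion(["walk", "run", "walk"]))  # walk
```

A hybrid model, as described above, would feed the output of feature_level_fusion into one classifier and then combine that classifier's decision with the decision of a classifier trained on a third modality via decision_level_fusion.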
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 1 Scope | The present document analyses FRMCS Use cases, system principles of FRMCS and Interworking between GSM-R and FRMCS in order to derive potential requirements. |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 2 References | The following documents contain provisions which, through reference in this text, constitute provisions of the present document.
- References are either specific (identified by date of publication, edition number, version number, etc.) or non‑specific.
- For a specific reference, subsequent revisions do not apply.
- For a non-specific reference, the latest version applies. In the case of a reference to a 3GPP document (including a GSM document), a non-specific reference implicitly refers to the latest version of that document in the same Release as the present document.
[1] 3GPP TR 21.905: "Vocabulary for 3GPP Specifications".
[2] 3GPP TS 36.213 V14.0.0: "Technical Specification Group Radio Access Network; Evolved Universal Terrestrial Radio Access (E-UTRA); Physical layer procedures", 2016.
[3] 3GPP TS 23.179 V13.3.0: "Technical Specification Group Services and System Aspects; Functional architecture and information flows to support mission critical communication services; Stage 2", 2016.
[4] TTA TTAK.KO-06.0437, LTE Based Railway Communication System Requirements (Conventional and High Speed Railway), Dec. 2016.
[5] TTA TTAK.KO-06.0370, User Requirements for LTE-Based Railway Communication System, Oct. 2014.
[6] TTA TTAK KO-06.0-369, Functional Requirements for LTE-Based Communication System, Oct. 2014.
[7] Y.-S. Song, J. Kim, S. W. Choi, and Y.-K. Kim, “Long term evolution for wireless railway communications: Testbed deployment and performance evaluation,” IEEE Comm. Mag., Feb. 2016.
[8] J. Kim, S. W. Choi, Y.-S. Song, and Y.-K. Kim, “Automatic train control over LTE: Design and performance evaluation,” IEEE Comm. Mag., Oct. 2015.
[9] UNISIG Subset-041: "ERTMS/ETCS Performance Requirements for Interoperability".
[10] UIC FU-7100: “FRMCS User Requirements Specification”.
[11] UIC MG-7900: “FRMCS Use Cases”.
[12] UIC CODE 950: “EIRENE Functional Requirements Specification (FRS)”.
[13] UIC CODE 951: “EIRENE System Requirements Specification (SRS)”.
[14] 3GPP TR 22.990: “Study on off-network for rail”. |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 3 Definitions and abbreviations | 
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 3.1 Definitions | For the purposes of the present document, the terms and definitions given in 3GPP TR 21.905 [1] and the following apply. A term defined in the present document takes precedence over the definition of the same term, if any, in 3GPP TR 21.905 [1].
Automatic Train Operation (ATO): Automatic Train Operation applications are responsible for acceleration to the permitted speed, speed reduction where necessary due to speed restrictions and stop at designated stations in the correct location.
Automatic Train Protection (ATP): Automatic Train Protection applications are responsible for giving Limit of Movement Authority to a train based on the train’s current speed, its braking capability and the distance it can go before it must stop.
Balise: An electronic beacon or transponder placed between the rails of a railway as part of an automatic train protection or operation (ATP/ATO) system.
Business communication applications: communication applications that support the railway business operation in general, such as wireless internet, etc.
Controller (Train Controller): A Ground FRMCS User provided with special capabilities by the FRMCS System.
Driver (Train Driver): A Mobile FRMCS User provided with special capabilities by the FRMCS System.
External System(s): A general category of stationary FRMCS Users. For example, External Systems could be systems monitoring for trains passing a red light to initiate a railway emergency call.
FRMCS Application: The application on a 3GPP UE offering railway specific communication services to the FRMCS User by making use of the communication capabilities offered by the 3GPP UE and the 3GPP network.
FRMCS Equipment Identity: The identity by which a FRMCS equipment can be addressed.
FRMCS Equipment Type: Indicates the purpose the FRMCS equipment is being used for, FRMCS equipment of different equipment types do have different capabilities.
FRMCS Equipment: The FRMCS Equipment consists of a 3GPP UE and a FRMCS Application residing on it. It may be combined with legacy railway communication equipment (e.g. GSM-R or TRS).
FRMCS Functional Identity: The identity, related to a user or to the equipment as specified in clause 9.3 "Role management and presence", by which its current Role (e.g. as Driver of a specific train, usually a train number) can be addressed.
FRMCS Network: This is a sub-part of the FRMCS System.
FRMCS Roaming: The ability for a FRMCS User to make use of FRMCS Applications in a Visited (FRMCS) Network.
FRMCS System: The system providing railway specific communication constituted of the FRMCS Equipment, the 3GPP transport and the application servers in the network. Legacy networks are not included in the FRMCS System.
FRMCS User Identity: The identity by which a FRMCS User can be addressed.
FRMCS User: A human user or a machine making use of the railway specific communication. FRMCS Users can be connected via 3GPP RAT, wired connectivity or other radio technology.
Ground FRMCS User: A general category of FRMCS Users that are predominantly stationary. They are mostly connected via wired connectivity but may also use wireless connectivity in certain conditions.
Home FRMCS Network: The Home FRMCS Network is the network with which the FRMCS User has a subscription.
Mobile FRMCS User: A general category of FRMCS Users that are mobile. They are therefore connected via wireless connectivity all the time.
Mobile Intelligent Assistant: 5G enabled robot with autonomous movements and artificial intelligence to support passengers in the Railway Smart Station.
Off-Network communication: direct communication between FRMCS Users in proximity.
On-Network communication: indirect communication between FRMCS Users connected to FRMCS Network(s).
Performance communication applications: applications that help to improve the performance of the railway operation, such as train departure, telemetry, etc.
Radio Block Centre (RBC): A train sends its position and speed information periodically to the RBC. The RBC uses the received information to decide movement authority of the train.
Rail Infrastructure Manager: A company that owns or manages rail infrastructure; within this document the Rail Infrastructure Manager owns, administrates and operates the FRMCS Network.
Railway Smart Station: a train station where 5G-based services, such as IoT and AI, are used for providing assisting railway services.
Railways Undertaking: A company that offers train freight or passenger transportation services, making use of an FRMCS Network operated by a Rail Infrastructure Manager for its operational communication needs.
Role (Functional Role): The function a FRMCS User or a FRMCS Equipment is currently performing. Examples of Roles are Driver, Controller or shunting staff, etc. This is indicated by the FRMCS Functional Identity.
Shunting: manoeuvring trains in order to change their location or composition.
Trackside staff: Staff working as trackside maintenance and/or shunting members.
Trainborne equipment: FRMCS Equipment which is physically embedded in a train.
Visited (FRMCS) Network: A Visited (FRMCS) Network can be either another FRMCS Network than the Home FRMCS Network, or a Public Land Mobile Network (PLMN).
Zone: A 2-dimensional region of a pre-determined size.
Zone resolution: The pre-determined size of the given zone. |
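The ATP and RBC definitions above amount to a simple kinematic check: a train may proceed only if it can stop within the granted Limit of Movement Authority, given its current speed and braking capability. A hypothetical back-of-envelope sketch follows; the constant-deceleration formula, the safety margin, and the numeric values are illustrative assumptions, not taken from any railway specification.

```python
def braking_distance_m(speed_mps: float, deceleration_mps2: float) -> float:
    """Stopping distance from speed v at constant deceleration a: v**2 / (2*a)."""
    return speed_mps ** 2 / (2 * deceleration_mps2)

def within_movement_authority(speed_mps: float, deceleration_mps2: float,
                              authority_limit_m: float, margin_m: float = 50.0) -> bool:
    """True if the train can stop, with a safety margin, before reaching
    the granted Limit of Movement Authority."""
    return braking_distance_m(speed_mps, deceleration_mps2) + margin_m <= authority_limit_m

# 200 km/h is roughly 55.6 m/s; 0.7 m/s^2 is an illustrative service braking rate.
print(within_movement_authority(55.6, 0.7, 2500.0))  # True
print(within_movement_authority(55.6, 0.7, 2000.0))  # False
```

In the scheme described above, the train would periodically report position and speed to the RBC, which performs a check of this kind before extending or withholding movement authority.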
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 3.2 Abbreviations | For the purposes of the present document, the abbreviations given in 3GPP TR 21.905 [1] and the following apply.
An abbreviation defined in the present document takes precedence over the definition of the same abbreviation, if any, in 3GPP TR 21.905 [1].
ATO Automatic Train Operation
ATP Automatic Train Protection
AVC Assured Voice Communication
DoS Denial of Service
GNSS Global Navigation Satellite System
LMR Land Mobile Radio
MACN Multi Access Core Network
NA Naming Authority
OATP On-board Automatic Train Protection
PSAP Public Safety Answering Point
RBC Radio Block Centre
REC Railway Emergency Communication
TRS Trunked Radio System
WATP Wayside Automatic Train Protection |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 4 Overview | The present document is an 800 series Technical Report (TR) written by 3GPP TSG SA WG1 (SA1). Such a TR is not normative, i.e. it cannot be used directly for implementation and cannot be referred to by other 3GPP specifications. It was primarily written by SA1 to summarise the high-level communication needs of the railway community and to identify the corresponding requirements, which, in a further step, have been introduced into normative Technical Specifications (TS) of 3GPP.
An 800 series TR is not updated when changes are made to its requirements in the process of introducing them into a TS, i.e. the text of the requirements listed in this TR is not aligned with the requirements in the TS. Because most of the requirements identified in this document were introduced into already existing Mission Critical Communication (MCX) TS, an alignment with the MCX terminology and functionality was made, with the result that most of the requirements here have been reworded in the TS. Likewise, future TS requirement changes affecting requirements stemming from this TR will not result in updates of this TR.
However, the "Comments" columns of the requirements tables listed below were updated to indicate the disposition of each requirement in the TS, in most cases summarising the deviations and decisions taken when introducing the requirement into a normative TS. By following these references into the normative TS, the reader can derive the functionality provided by 3GPP for railway communication.
FRMCS will adapt 3GPP transport to provide communication to railway users. It will eventually resemble GSM-R and will additionally provide communication capabilities beyond what GSM-R has been able to offer: higher data rates, lower data latencies, multimedia communication, and improved communication reliability. FRMCS considers end-to-end use cases and also provides requirements that might or might not be in the scope of existing 3GPP specifications. To facilitate smooth migration from legacy communication systems (e.g. GSM) to FRMCS, interworking requirements between legacy communication systems and FRMCS are provided.
FRMCS Equipment shall connect to the application domain through 3GPP radio access or other access. The FRMCS System provides emergency group communication as well as low-latency, highly reliable data and video services in high-speed train environments. Amongst others, it has the following important features:
- Prioritized emergency group communication, train control data and video services
- Seamless connectivity in high-speed railway environments
- Low-latency and highly reliable data and video services
- Real-time train monitoring and management for safe train operation
- Reliable location tracking, including in tunnels
- Interworking with the legacy GSM-R railway communication system
Figure 4-1: High-level relation of FRMCS and legacy systems
Basically, railway communication services [5] can be categorized into
- Train control services
- Maintenance services
- Railway specific services (such as Railway Emergency Call, functional addressing, and location-based addressing)
- Other services (providing train crews or train Drivers with information of train operation and interworking with the existing railway communication systems)
This study categorizes all the use cases by considering inherent characteristics of railway applications. Specifically, the following categories of use cases are considered.
- Basic functionality
- Critical communication applications
- Performance communication applications
- Business communication applications
- Critical support applications
- Performance support applications
- Business support applications
- FRMCS System principles
The categories can be depicted conceptually as follows:
Figure 4-2: Grouping of FRMCS Applications |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5 Basic functionality use cases | |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.1 Introduction | The basic functionality use cases describe the behaviour of the FRMCS Equipment when powered up and down. For power up, the already powered-up 3GPP UE is taken as the starting point; conversely, the same applies for power down.
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.2 Device power on and shut-down related use cases | In this clause the use cases related to the function "Initialisation and shut-down" are defined.
- Power on the UE
- Access to the FRMCS System
- Controlled power down UE
- Uncontrolled power down UE
Note: Annex A provides examples of Role management (such as functional identities or FRMCS Equipment Identities) in the railway environment. |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.3 Use case: Power on the UE | |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.3.1 Description | This use case provides the user with a powered-on UE. |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.3.2 Pre-conditions | The UE is switched off.
Note: In this use case and all the following ones it is assumed that the UE contains a FRMCS Application; a UE with a FRMCS Application is therefore further referred to as FRMCS Equipment.
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.3.3 Service flows | Successful self-test
The user switches on the UE.
The FRMCS Application performs a self-test. If the test is successful, the user is informed about this.
Unsuccessful self-test
The user switches on the UE.
The FRMCS Application performs a self-test. If the test is not successful, the user is informed about this. |
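The two service flows above differ only in the self-test outcome reported to the user. A minimal sketch follows, assuming hypothetical check names and report strings; the TR leaves the actual self-test content to implementation (see [R-5.3-001]).

```python
def power_on_self_test(checks):
    """Run the FRMCS Application self-test at power-on and report the result
    to the user, covering both the successful and unsuccessful flows."""
    failures = [name for name, check in checks.items() if not check()]
    if failures:
        return "Self-test failed: " + ", ".join(failures)
    return "Self-test successful"

# Hypothetical checks; real FRMCS Equipment would test its own subsystems.
print(power_on_self_test({"application storage": lambda: True, "audio": lambda: True}))
print(power_on_self_test({"application storage": lambda: True, "audio": lambda: False}))
```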
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.3.4 Post-conditions | The UE is switched on and attached to a 3GPP network following normal 3GPP defined network selection procedures but not logged into any FRMCS System. The user is informed about the results of the self-test. |
349ca2cc0adaee1226ea2ffcee1cba56 | 22.989 | 5.3.5 Potential requirements and gap analysis | Reference Number: [R-5.3-001]
Requirement text: The FRMCS Application shall be capable to perform a self-test and inform the user about the results.
Application / Transport: A
SA1 spec covering: N/A
Comments: Implementation requirement