<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.29 (Ruby 2.6.10) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-kpugin-rush-03" category="info" tocInclude="true" sortRefs="true" symRefs="true" version="3">
  <!-- xml2rfc v2v3 conversion 3.28.1 -->
  <front>
    <title abbrev="rush">RUSH - Reliable (unreliable) streaming protocol</title>
    <seriesInfo name="Internet-Draft" value="draft-kpugin-rush-03"/>
    <author initials="K." surname="Pugin" fullname="Kirill Pugin">
      <organization>Meta</organization>
      <address>
        <email>ikir@meta.com</email>
      </address>
    </author>
    <author initials="N." surname="Garg" fullname="Nitin Garg">
      <organization>Meta</organization>
      <address>
        <email>ngarg@meta.com</email>
      </address>
    </author>
    <author initials="A." surname="Frindell" fullname="Alan Frindell">
      <organization>Meta</organization>
      <address>
        <email>afrind@meta.com</email>
      </address>
    </author>
    <author initials="J." surname="Cenzano" fullname="Jordi Cenzano">
      <organization>Meta</organization>
      <address>
        <email>jcenzano@meta.com</email>
      </address>
    </author>
    <author initials="J." surname="Weissman" fullname="Jake Weissman">
      <organization>Meta</organization>
      <address>
        <email>jakeweissman@meta.com</email>
      </address>
    </author>
    <date year="2025" month="April" day="21"/>
    <area>General</area>
    <workgroup>TODO Working Group</workgroup>
    <keyword>Internet-Draft</keyword>
    <abstract>
      <?line 50?>

<t>RUSH is an application-level protocol for ingesting live video.
This document describes the protocol and how it maps onto QUIC.</t>
    </abstract>
    <note removeInRFC="true">
      <name>Discussion Venues</name>
      <t>Discussion of this document takes place on the
    mailing list (),
  which is archived at <eref target=""/>.</t>
      <t>Source for this draft and an issue tracker can be found at
  <eref target="https://github.com/afrind/draft-rush"/>.</t>
    </note>
  </front>
  <middle>
    <?line 55?>

<section anchor="introduction">
      <name>Introduction</name>
<t>RUSH is a bidirectional application-level protocol designed for live video
ingestion that runs on top of QUIC.</t>
      <t>RUSH was built as a replacement for RTMP (Real-Time Messaging Protocol), with the
goals of supporting new audio and video codecs, extensibility in the
form of new message types, and multi-track support. In addition, RUSH gives
applications the option to control data delivery guarantees by utilizing QUIC
streams.</t>
      <t>This document describes the RUSH protocol, wire format, and QUIC mapping.</t>
    </section>
    <section anchor="conventions-and-definitions">
      <name>Conventions and Definitions</name>
      <t>The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD",
"SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this
document are to be interpreted as described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/>
when, and only when, they appear in all capitals, as shown here.</t>
      <dl>
        <dt>Frame/Message:</dt>
        <dd>
          <t>logical unit of information that client and server can exchange</t>
        </dd>
        <dt>PTS:</dt>
        <dd>
          <t>presentation timestamp</t>
        </dd>
        <dt>DTS:</dt>
        <dd>
          <t>decoding timestamp</t>
        </dd>
        <dt>AAC:</dt>
        <dd>
<t>Advanced Audio Coding</t>
        </dd>
        <dt>NALU:</dt>
        <dd>
<t>network abstraction layer unit</t>
        </dd>
        <dt>VPS:</dt>
        <dd>
          <t>video parameter set (H265 video specific NALU)</t>
        </dd>
        <dt>SPS:</dt>
        <dd>
          <t>sequence parameter set (H264/H265 video specific NALU)</t>
        </dd>
        <dt>PPS:</dt>
        <dd>
          <t>picture parameter set (H264/H265 video specific NALU)</t>
        </dd>
        <dt>ADTS header:</dt>
        <dd>
          <t><em>Audio Data Transport Stream Header</em></t>
        </dd>
        <dt>ASC:</dt>
        <dd>
          <t>Audio specific config</t>
        </dd>
        <dt>GOP:</dt>
        <dd>
          <t>Group of pictures, specifies the order in which intra- and inter-frames are
arranged.</t>
        </dd>
      </dl>
    </section>
    <section anchor="theory-of-operations">
      <name>Theory of Operations</name>
      <section anchor="connection-establishment">
        <name>Connection establishment</name>
        <t>In order to live stream using RUSH, the client establishes a QUIC connection
using the ALPN token "rush".</t>
<t>After the QUIC connection is established, the client creates a new bidirectional
QUIC stream, chooses a starting frame ID, and sends a <tt>Connect</tt> frame
<xref target="connect-frame"/> over that stream.  This stream is called the Connect Stream.</t>
<t>The client sends the <tt>mode of operation</tt> setting in the <tt>Connect</tt> frame <xref target="connect-frame"/> payload.</t>
        <t>One connection SHOULD only be used to send one media stream; for now, one video and one audio track are supported. Future versions may support multiple tracks per stream.</t>
      </section>
      <section anchor="sending-video-data">
        <name>Sending Video Data</name>
        <t>The client can choose to wait for the <tt>ConnectAck</tt> frame <xref target="connect-ack-frame"/>
or it can start optimistically sending data immediately after sending the <tt>Connect</tt> frame.</t>
<t>A track is a logical organization of the data; for example, a broadcast can have one
video track and two audio tracks (for two languages). The client can send data
for multiple tracks simultaneously.</t>
        <t>The encoded audio or video data of each track is serialized into frames (see
<xref target="audio-frame"/> or <xref target="video-frame"/>) and transmitted from the client to the
server.  Each track has its own monotonically increasing frame ID sequence. The
client MUST start with initial frame ID = 1.</t>
        <t>Depending on mode of operation (<xref target="quic-mapping"/>), the client sends audio and
video frames on the Connect stream or on a new QUIC stream for each frame.</t>
        <t>In <tt>Multi Stream Mode</tt> (<xref target="multi-stream-mode"/>), the client can stop sending a
frame by resetting the corresponding QUIC stream. In this case, there is no
guarantee that the frame was received by the server.</t>
      </section>
      <section anchor="receiving-data">
        <name>Receiving data</name>
<t>Upon receiving a <tt>Connect</tt> frame <xref target="connect-frame"/>, if the server accepts the stream, it will reply with a <tt>ConnectAck</tt> frame <xref target="connect-ack-frame"/> and prepare to receive audio/video data.</t>
<t>It's possible that in <tt>Multi Stream Mode</tt> (<xref target="multi-stream-mode"/>), the server
receives audio or video data before it receives the <tt>Connect</tt> frame <xref target="connect-frame"/>.  The
implementation can choose whether to buffer or drop that data; it cannot be
interpreted correctly before the <tt>Connect</tt> frame <xref target="connect-frame"/> arrives.</t>
<t>In <tt>Single Stream Mode</tt> (<xref target="single-stream-mode"/>), the transport guarantees that
frames arrive at the application layer in the order they were sent.</t>
<t>In <tt>Multi Stream Mode</tt>, frames may arrive at the application
layer in a different order than they were sent; therefore, the server MUST keep
track of the last received frame ID for every track that it receives. A gap in the
frame ID sequence on a given track can indicate out-of-order delivery, and the
server MAY wait until the missing frames arrive. The server MUST consider a frame lost
if the corresponding QUIC stream was reset.</t>
<t>Upon detecting a gap in the frame sequence, the server MAY wait for the missing
frames to arrive for an implementation-defined time. If the missing frames don't
arrive, the server SHOULD consider them lost and continue processing the rest of the
frames. For example, if the server receives the following frames for track 1: <tt>1
2 3 5 6</tt> and frame <tt>#4</tt> hasn't arrived after the implementation-defined timeout,
the server SHOULD continue processing frames <tt>5</tt> and <tt>6</tt>.</t>
<t>It is worth highlighting that in Multi Stream Mode a de-jitter function (which introduces latency) is needed, and that the subsequent processing pipeline should tolerate lost frames, i.e. "holes" in the audio/video streams.</t>
        <t>When the client is done streaming, it sends the <tt>End of Video</tt> frame
(<xref target="end-of-video-frame"/>) to indicate to the server that there won't be any more
data sent.</t>
      </section>
      <section anchor="reconnect">
        <name>Reconnect</name>
<t>If the QUIC connection is closed at any point, the client MAY reconnect by simply
repeating the <tt>Connection establishment</tt> process (<xref target="connection-establishment"/>) and
resume sending the same video where it left off.  In order to support
termination of the new connection by a different server, the client SHOULD
resume sending video frames starting with an I-frame, to guarantee that the video
track can be decoded from the first frame sent.</t>
<t>Reconnect can be initiated by the server if it needs to "go away" for
maintenance. In this case, the server sends a <tt>GOAWAY</tt> frame (<xref target="goaway-frame"/>)
to advise the client to gracefully close the connection.  This allows the client to
finish sending some data and establish a new connection to continue sending
without interruption.</t>
      </section>
    </section>
    <section anchor="wire-format">
      <name>Wire Format</name>
      <section anchor="frame-header">
        <name>Frame Header</name>
        <t>The client and server exchange information using frames. There are different
types of frames and the payload of each frame depends on its type.</t>
<t>All multi-byte fields on the wire are in <strong>big-endian</strong> byte order.</t>
        <t>Generic frame format:</t>
        <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
|Type(8)| Payload ...                                          |
+-------+------------------------------------------------------+
]]></artwork>
        <dl>
<dt>Length(64):</dt>
          <dd>
            <t>Each frame starts with a length field, a 64-bit value giving the size of the frame
in bytes, including the predefined fields (so if Length is 100 bytes, then the payload
length is 100 - 8 - 8 - 1 = 83 bytes).</t>
          </dd>
          <dt>ID(64):</dt>
          <dd>
<t>64-bit frame sequence number; every new frame MUST have a sequence ID greater
than that of the previous frame within the same track.  The track ID is
specified in each frame; if a track ID is not specified, it is implicitly 0.</t>
          </dd>
          <dt>Type(8):</dt>
          <dd>
            <t>1 byte representing the type of the frame.</t>
          </dd>
        </dl>
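<t>For illustration, the generic header can be serialized and parsed with
big-endian unpacking (a sketch; the function names are assumptions):</t>
        <sourcecode type="python"><![CDATA[
import struct

FRAME_HEADER = struct.Struct(">QQB")  # Length (64) | ID (64) | Type (8), big endian

def build_frame(frame_id, frame_type, payload):
    # Length covers the whole frame: 8 + 8 + 1 = 17 header bytes plus the payload.
    return FRAME_HEADER.pack(17 + len(payload), frame_id, frame_type) + payload

def parse_frame(buf):
    """Parse one frame from the front of buf; returns (frame_id, frame_type, payload)."""
    length, frame_id, frame_type = FRAME_HEADER.unpack_from(buf, 0)
    payload = buf[FRAME_HEADER.size:length]
    return frame_id, frame_type, payload
]]></sourcecode>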
        <t>Predefined frame types:</t>
        <table>
          <thead>
            <tr>
              <th align="left">Frame Type</th>
              <th align="left">Frame</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td align="left">0x0</td>
              <td align="left">connect frame</td>
            </tr>
            <tr>
              <td align="left">0x1</td>
              <td align="left">connect ack frame</td>
            </tr>
            <tr>
              <td align="left">0x2</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x3</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x4</td>
              <td align="left">end of video frame</td>
            </tr>
            <tr>
              <td align="left">0x5</td>
              <td align="left">error frame</td>
            </tr>
            <tr>
              <td align="left">0x6</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x7</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x8</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x9</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xA</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
<td align="left">0xB</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xC</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xD</td>
              <td align="left">video frame</td>
            </tr>
            <tr>
              <td align="left">0xE</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
<td align="left">0xF</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
<td align="left">0x10</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x11</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x12</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x13</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x14</td>
              <td align="left">audio frame</td>
            </tr>
            <tr>
              <td align="left">0x15</td>
              <td align="left">GOAWAY frame</td>
            </tr>
            <tr>
              <td align="left">0x16</td>
              <td align="left">Timed metadata</td>
            </tr>
          </tbody>
        </table>
      </section>
      <section anchor="frames">
        <name>Frames</name>
        <section anchor="connect-frame">
          <name>Connect frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+---------------+---------------+--------------+
| 0x0   |Version|Video Timescale|Audio Timescale|              |
+-------+-------+---------------+---------------+--------------+
|                    Live Session ID(64)                       |
+--------------------------------------------------------------+
| Payload ...                                                  |
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Version (unsigned 8bits):</dt>
            <dd>
              <t>version of the protocol (initial version is 0x0).</t>
            </dd>
            <dt>Video Timescale(unsigned 16bits):</dt>
            <dd>
<t>timescale for all video frame timestamps on this connection, for instance 25.</t>
            </dd>
            <dt>Audio Timescale(unsigned 16bits):</dt>
            <dd>
              <t>timescale for all audio samples timestamps on this connection, recommended
value same as audio sample rate, for example 44100</t>
            </dd>
<dt>Live Session ID(unsigned 64bits):</dt>
            <dd>
              <t>identifier of the broadcast; when reconnecting, the client MUST use the same Live
Session ID.</t>
            </dd>
<dt>Payload:</dt>
            <dd>
              <t>OPTIONAL application- and version-specific data that can be used by the server.
One possible implementation is to carry UTF-8 encoded JSON data that specifies parameters the server needs to authenticate / validate the connection, for instance:</t>
              <artwork><![CDATA[
payloadBytes = strToJSonUtf8('{"url": "/rtmp/BID?s_bl=1&s_l=3&s_sc=VALID&s_sw=0&s_vt=usr_dev&a=TOKEN"}')
]]></artwork>
            </dd>
          </dl>
<t>This frame is used by the client to initiate broadcasting. The client can start
sending other frames immediately after the Connect frame <xref target="connect-frame"/> without waiting for an
acknowledgement from the server.</t>
          <t>If the server doesn't support the VERSION sent by the client, the server sends an Error
frame <xref target="error-frame"/> with code <tt>UNSUPPORTED VERSION</tt>.</t>
          <t>If the audio timescale or video timescale is 0, the server sends an Error frame <xref target="error-frame"/> with
error code <tt>INVALID FRAME FORMAT</tt> and closes the connection.</t>
          <t>If the client receives a Connect frame from the server, the client sends an
Error frame <xref target="error-frame"/> with code <tt>TBD</tt>.</t>
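<t>A sketch of Connect frame serialization under the layout above (field packing
only; the function name is an assumption):</t>
          <sourcecode type="python"><![CDATA[
import struct

def build_connect_frame(frame_id, video_timescale, audio_timescale,
                        session_id, payload=b"", version=0x0):
    # Length (64) | ID (64) | Type (8) = 0x0 | Version (8) |
    # Video Timescale (16) | Audio Timescale (16) | Live Session ID (64):
    # the fixed fields total 8 + 8 + 1 + 1 + 2 + 2 + 8 = 30 bytes.
    length = 30 + len(payload)
    return struct.pack(">QQBBHHQ", length, frame_id, 0x0, version,
                       video_timescale, audio_timescale, session_id) + payload
]]></sourcecode>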
        </section>
        <section anchor="connect-ack-frame">
          <name>Connect Ack frame</name>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64) = 17                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x1   |
+-------+
]]></artwork>
<t>The server sends the "Connect Ack" frame in response to a "Connect" frame <xref target="connect-frame"/>,
indicating that the server accepts the "version" and the stream is authenticated / validated (optional), so it is ready to receive data.</t>
          <t>If the client doesn't receive "Connect Ack" frame from the server within a
timeout, it will close the connection.  The timeout value is chosen by the
implementation.</t>
<t>There can be only one "Connect Ack" frame sent over the lifetime of the QUIC
connection.</t>
          <t>If the server receives a Connect Ack frame from the client, the server sends an
Error frame with code <tt>TBD</tt>.</t>
        </section>
        <section anchor="end-of-video-frame">
          <name>End of Video frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64) = 17                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x4   |
+-------+
]]></artwork>
<t>The End of Video frame is sent by a client when it's done sending data and is about
to close the connection. The server SHOULD ignore any frames received after it.</t>
        </section>
        <section anchor="error-frame">
          <name>Error frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64) = 29                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x5   |
+-------+------------------------------------------------------+
|                   Sequence ID (64)                           |
+------------------------------+-------------------------------+
|      Error Code (32)         |
+------------------------------+
]]></artwork>
          <dl>
            <dt>Sequence ID(unsigned 64bits):</dt>
            <dd>
<t>ID of the frame sent by the client for which the error is generated; ID=0x0
indicates a connection-level error.</t>
            </dd>
            <dt>Error Code(unsigned 32bits):</dt>
            <dd>
              <t>Indicates the error code</t>
            </dd>
          </dl>
<t>An Error frame can be sent by the client or the server to indicate that an error
occurred.</t>
          <t>Some errors are fatal, and the connection will be closed after sending the Error
frame.</t>
          <t>See <xref target="connection-errors"/> and <xref target="frame-errors"/> for more information about error codes.</t>
        </section>
        <section anchor="video-frame">
          <name>Video frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+----------------------------------------------+
|  0xD  | Codec |
+-------+-------+----------------------------------------------+
|                        PTS (64)                              |
+--------------------------------------------------------------+
|                        DTS (64)                              |
+-------+------------------------------------------------------+
|TrackID|                                                      |
+-------+-------+----------------------------------------------+
| I Offset      | Video Data ...                               |
+---------------+----------------------------------------------+
]]></artwork>
          <dl>
            <dt>Codec (unsigned 8bits):</dt>
            <dd>
              <t>specifies codec that was used to encode this frame.</t>
            </dd>
            <dt>PTS (signed 64bits):</dt>
            <dd>
              <t>presentation timestamp in connection video timescale</t>
            </dd>
            <dt>DTS (signed 64bits):</dt>
            <dd>
              <t>decoding timestamp in connection video timescale</t>
            </dd>
          </dl>
          <t>Supported type of codecs:</t>
          <table>
            <thead>
              <tr>
                <th align="left">Type</th>
                <th align="left">Codec</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td align="left">0x1</td>
                <td align="left">H264</td>
              </tr>
              <tr>
                <td align="left">0x2</td>
                <td align="left">H265</td>
              </tr>
              <tr>
                <td align="left">0x3</td>
                <td align="left">VP8</td>
              </tr>
              <tr>
                <td align="left">0x4</td>
                <td align="left">VP9</td>
              </tr>
            </tbody>
          </table>
          <dl>
            <dt>Track ID (unsigned 8bits):</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>I Offset (unsigned 16bits):</dt>
            <dd>
              <t>Distance from sequence ID of the I-frame that is required before this frame
can be decoded. This can be useful to decide if frame can be dropped.</t>
            </dd>
            <dt>Video Data:</dt>
            <dd>
              <t>variable length field, that carries actual video frame data that is codec
dependent</t>
            </dd>
          </dl>
<t>For the h264/h265 codecs, "Video Data" is 1 or more NALUs in AVCC format (4-byte size header):</t>
          <artwork><![CDATA[
0       1       2       3
+------------------------------+
|       NALU Length (32)       |
+------------------------------+
|       NALU Data ...
+------------------------------+
]]></artwork>
<t>Every h264 video key-frame MUST start with SPS/PPS NALUs.
Every h265 video key-frame MUST start with VPS/SPS/PPS NALUs.</t>
          <t>Binary concatenation of the "Video Data" from consecutive video frames, without
data loss, MUST produce a valid h264/h265 bitstream.</t>
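<t>A sketch of Video frame serialization under the layout above (note that PTS
and DTS are signed; the function name is an assumption):</t>
          <sourcecode type="python"><![CDATA[
import struct

# Length (64) | ID (64) | Type (8) = 0xD | Codec (8) | PTS (signed 64) |
# DTS (signed 64) | TrackID (8) | I Offset (16): 37 fixed bytes before Video Data.
VIDEO_FIXED = struct.Struct(">QQBBqqBH")

def build_video_frame(frame_id, codec, pts, dts, track_id, i_offset, data):
    return VIDEO_FIXED.pack(37 + len(data), frame_id, 0xD, codec,
                            pts, dts, track_id, i_offset) + data
]]></sourcecode>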
        </section>
        <section anchor="audio-frame">
          <name>Audio frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+----------------------------------------------+
| 0x14  | Codec |
+-------+-------+----------------------------------------------+
|                      Timestamp (64)                          |
+-------+-------+-------+--------------------------------------+
|TrackID|   Header Len  |
+-------+-------+-------+--------------------------------------+
| Header + Audio Data ...
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Codec (unsigned 8bits):</dt>
            <dd>
              <t>specifies codec that was used to encode this frame.</t>
            </dd>
          </dl>
          <t>Supported type of codecs:</t>
          <table>
            <thead>
              <tr>
                <th align="left">Type</th>
                <th align="left">Codec</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td align="left">0x1</td>
                <td align="left">AAC</td>
              </tr>
              <tr>
                <td align="left">0x2</td>
                <td align="left">OPUS</td>
              </tr>
            </tbody>
          </table>
          <dl>
            <dt>Timestamp (signed 64bits):</dt>
            <dd>
              <t>timestamp of first audio sample in Audio Data.</t>
            </dd>
            <dt>Track ID (unsigned 8bits):</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>Header Len (unsigned 16bits):</dt>
            <dd>
              <t>Length in bytes of the audio header contained in the first portion of the payload</t>
            </dd>
            <dt>Audio Data (variable length field):</dt>
            <dd>
              <t>it carries the audio header and 1 or more audio frames that are codec dependent.</t>
            </dd>
          </dl>
<t>For the AAC codec:</t>
          <ul spacing="normal">
            <li>
              <t>"Audio Data" is 1 or more AAC samples, prefixed with the Audio Specific Config (ASC) header defined in <tt>ISO 14496-3</tt>.</t>
            </li>
            <li>
              <t>Binary concatenation of all AAC samples in "Audio Data" from consecutive audio frames, without data loss, MUST produce a valid AAC bitstream.</t>
            </li>
          </ul>
          <t>For the OPUS codec:</t>
          <ul spacing="normal">
            <li>
              <t>"Audio Data" is 1 or more OPUS samples, prefixed with the OPUS header as defined in <xref target="RFC7845"/>.</t>
            </li>
          </ul>
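<t>A sketch of Audio frame serialization under the layout above, with the
codec-specific header (e.g. the ASC for AAC) carried first in the payload and
its size recorded in Header Len (the function name is an assumption):</t>
          <sourcecode type="python"><![CDATA[
import struct

# Length (64) | ID (64) | Type (8) = 0x14 | Codec (8) | Timestamp (signed 64) |
# TrackID (8) | Header Len (16): 29 fixed bytes before Header + Audio Data.
AUDIO_FIXED = struct.Struct(">QQBBqBH")

def build_audio_frame(frame_id, codec, timestamp, track_id, header, data):
    body = header + data
    return AUDIO_FIXED.pack(29 + len(body), frame_id, 0x14, codec,
                            timestamp, track_id, len(header)) + body
]]></sourcecode>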
        </section>
        <section anchor="goaway-frame">
          <name>GOAWAY frame</name>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64) = 17                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x15  |
+-------+
]]></artwork>
          <t>The GOAWAY frame is used by the server to initiate graceful shutdown of a connection, for example, for server maintenance.</t>
<t>Upon receiving a GOAWAY frame, the client MUST send the frames remaining in the current GOP and
stop sending new frames on this connection. The client SHOULD establish a new connection and resume sending frames there, so that the resumed video track starts with an IDR frame.</t>
          <t>After sending a GOAWAY frame, the server continues processing arriving frames
for an implementation defined time, after which the server SHOULD close
the connection.</t>
        </section>
        <section anchor="timedmetadata-frame">
          <name>TimedMetadata frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x16  |TrackID|
+-------+-------+----------------------------------------------+
|                      Topic (64)                              |
+--------------------------------------------------------------+
|                      EventMessage (64)                       |
+-------+------------------------------------------------------+
|                      Timestamp (64)                          |
+-------+------------------------------------------------------+
|                      Duration (64)                           |
+-------+------------------------------------------------------+
| Payload ...
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Track ID (unsigned 8bits):</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
<dt>Topic (unsigned 64bits):</dt>
            <dd>
              <t>A unique identifier of the app-level feature. May be used to decode the payload or do other application-specific processing</t>
            </dd>
            <dt>EventMessage (unsigned 64bits):</dt>
            <dd>
              <t>A unique identifier of the event message, used for app-level event deduplication</t>
            </dd>
            <dt>Timestamp (signed 64bits):</dt>
            <dd>
              <t>PTS of the event</t>
            </dd>
            <dt>Duration (unsigned 64bits):</dt>
            <dd>
              <t>duration of the event in video PTS timescale. Can be 0.</t>
            </dd>
            <dt>Payload:</dt>
            <dd>
              <t>variable length field. May be used by the app to send additional event metadata. UTF-8 JSON recommended</t>
            </dd>
          </dl>
        </section>
      </section>
      <section anchor="quic-mapping">
        <name>QUIC Mapping</name>
<t>One of the main goals of the RUSH protocol is to give applications a
way to control the reliability of audio/video data delivery. This is achieved by
using a special mode (<xref target="multi-stream-mode"/>).</t>
        <section anchor="single-stream-mode">
          <name>Single Stream Mode</name>
<t>In Single Stream Mode, RUSH uses one bidirectional QUIC stream to send and receive
data.  Using one stream guarantees reliable, in-order delivery; applications
can rely on the QUIC transport layer to retransmit lost packets.  The performance
characteristics of this mode are similar to RTMP over TCP.</t>
        </section>
        <section anchor="multi-stream-mode">
          <name>Multi Stream Mode</name>
<t>In Single Stream Mode (<xref target="single-stream-mode"/>), if a packet belonging to a video frame is lost, all packets sent
after it will not be delivered to the application until the loss is repaired, even though those packets may
have arrived at the server. This introduces head-of-line blocking and can
negatively impact latency.</t>
          <t>To address this problem, RUSH defines "Multi Stream Mode", in which one QUIC
stream is used per audio/video frame.</t>
          <t>Connection establishment follows the normal procedure: the client sends a
Connect frame; after that, Video and Audio frames are sent according to the
following rules:</t>
          <ul spacing="normal">
            <li>
              <t>Each new frame is sent on a new bidirectional QUIC stream</t>
            </li>
            <li>
              <t>Frames within the same track must have IDs that are monotonically increasing,
such that ID(n) = ID(n-1) + 1</t>
            </li>
          </ul>
          <t>The receiver reconstructs the track using the frame IDs.</t>
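          <t>The reconstruction step can be sketched as follows; the class and method
names are illustrative and not part of the protocol.</t>

```python
class TrackReassembler:
    # Reorders frames of one track by their monotonically increasing IDs
    # (ID(n) = ID(n-1) + 1), releasing them in order as gaps fill in.
    def __init__(self, first_id: int = 1):
        self.next_id = first_id
        self.pending = {}  # frame ID -> frame bytes

    def on_frame(self, frame_id: int, frame: bytes) -> list:
        # Buffer an arriving frame and return any frames that are now
        # deliverable in order.
        self.pending[frame_id] = frame
        ready = []
        while self.next_id in self.pending:
            ready.append(self.pending.pop(self.next_id))
            self.next_id += 1
        return ready
```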
          <t>Response frames (Connect Ack (<xref target="connect-ack-frame"/>) and Error (<xref target="error-frame"/>)) are sent
on the same bidirectional stream that carried the frame they respond to.</t>
          <t>The client MAY control delivery reliability by setting a delivery timer for
every audio or video frame and resetting the QUIC stream when the timer fires.
This effectively stops retransmissions if the frame was not fully delivered in
time.</t>
          <t>The timeout is implementation-defined; however, future versions of this draft
will define a way to negotiate it.</t>
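          <t>A minimal sketch of this timer-and-reset behavior follows. The stream object
and its write/finish/wait_acked/reset methods are a hypothetical QUIC API used
for illustration, not a real library interface.</t>

```python
import asyncio

async def send_frame_with_deadline(stream, frame: bytes, timeout_s: float) -> bool:
    # Send one audio/video frame on its own stream; if delivery is not
    # acknowledged within timeout_s, reset the stream so the transport
    # stops retransmitting the frame's data.
    stream.write(frame)
    stream.finish()
    try:
        await asyncio.wait_for(stream.wait_acked(), timeout_s)
        return True
    except asyncio.TimeoutError:
        stream.reset(error_code=0)  # abandon this frame
        return False
```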
        </section>
      </section>
    </section>
    <section anchor="error-handling">
      <name>Error Handling</name>
      <t>An endpoint that detects an error SHOULD signal the existence of that error to
its peer.  Errors can affect an entire connection (see <xref target="connection-errors"/>),
or a single frame (see <xref target="frame-errors"/>).</t>
      <t>The most appropriate error code SHOULD be included in the error frame that
signals the error.</t>
      <section anchor="connection-errors">
        <name>Connection Errors</name>
        <t>The following error codes affect the whole connection:</t>
        <t>1 - UNSUPPORTED VERSION - indicates that the server does not support the
version specified in the Connect frame.</t>
        <t>4 - CONNECTION REJECTED - indicates that the server cannot process the
connection for any reason.</t>
      </section>
      <section anchor="frame-errors">
        <name>Frame errors</name>
        <t>There are two error codes defined in the core protocol that indicate a problem
with a particular frame:</t>
        <t>2 - UNSUPPORTED CODEC - indicates that the server does not support the given
audio or video codec.</t>
        <t>3 - INVALID FRAME FORMAT - indicates that the receiver was unable to parse
the frame or that there was an issue with a field's value.</t>
      </section>
    </section>
    <section anchor="extensions">
      <name>Extensions</name>
      <t>RUSH permits extension of the protocol.</t>
      <t>Extensions are permitted to use new frame types (<xref target="wire-format"/>), new error
codes (<xref target="error-frame"/>), or new audio and video codecs (<xref target="audio-frame"/>,
<xref target="video-frame"/>).</t>
      <t>Implementations MUST ignore unknown or unsupported values in all extensible
protocol elements, except <tt>codec id</tt>: an unknown or unsupported codec id
results in an UNSUPPORTED CODEC error.  Implementations MUST discard frames
that have unknown or unsupported types.</t>
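      <t>A receiver's frame loop that honors these rules might look like the sketch
below. The 8-byte length and 1-byte type header layout used here is an
assumption for illustration; the actual field widths are given in the wire
format section.</t>

```python
def iter_known_frames(buf: bytes, known_types: set):
    # Walk a buffer of complete frames, discarding frames whose type is
    # unknown.  Assumed layout per frame: 8-byte big-endian length covering
    # the whole frame (header included), then a 1-byte type, then the body.
    offset = 0
    while offset + 9 <= len(buf):
        length = int.from_bytes(buf[offset:offset + 8], "big")
        if length < 9 or offset + length > len(buf):
            raise ValueError("frame length inconsistent with buffer")
        ftype = buf[offset + 8]
        if ftype in known_types:
            yield ftype, buf[offset:offset + length]
        offset += length
```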
    </section>
    <section anchor="security-considerations">
      <name>Security Considerations</name>
      <t>The RUSH protocol relies on the security guarantees provided by the
underlying transport.</t>
      <t>Implementations SHOULD be prepared to handle cases where a sender
deliberately sends frames with gaps in sequence IDs.</t>
      <t>Implementations SHOULD be prepared to handle cases where the server never
receives a Connect frame (<xref target="connect-frame"/>).</t>
      <t>A frame parser MUST ensure that the value of the frame length field (see
<xref target="frame-header"/>) matches the actual length of the frame, including
the frame header.</t>
      <t>Implementations SHOULD be prepared to handle cases where a sender sends a
frame with a very large frame length field value.</t>
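      <t>A minimal sketch of such length checks follows; the maximum-length cap is a
local policy value chosen for illustration and is not defined by this
document.</t>

```python
MAX_FRAME_LENGTH = 16 * 1024 * 1024  # illustrative local cap, not from the spec

def validate_frame_length(declared: int, actual: int) -> None:
    # Reject frames whose declared length does not match the bytes actually
    # received, and bound memory use before buffering a frame.
    if declared > MAX_FRAME_LENGTH:
        raise ValueError("frame length exceeds local limit")
    if declared != actual:
        raise ValueError("frame length field does not match frame size")
```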
    </section>
    <section anchor="iana-considerations">
      <name>IANA Considerations</name>
      <t>TODO: add frame type registry, error code registry, audio/video codecs
registry</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-normative-references">
      <name>Normative References</name>
      <reference anchor="RFC2119">
        <front>
          <title>Key words for use in RFCs to Indicate Requirement Levels</title>
          <author fullname="S. Bradner" initials="S." surname="Bradner"/>
          <date month="March" year="1997"/>
          <abstract>
            <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
          </abstract>
        </front>
        <seriesInfo name="BCP" value="14"/>
        <seriesInfo name="RFC" value="2119"/>
        <seriesInfo name="DOI" value="10.17487/RFC2119"/>
      </reference>
      <reference anchor="RFC8174">
        <front>
          <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
          <author fullname="B. Leiba" initials="B." surname="Leiba"/>
          <date month="May" year="2017"/>
          <abstract>
            <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
          </abstract>
        </front>
        <seriesInfo name="BCP" value="14"/>
        <seriesInfo name="RFC" value="8174"/>
        <seriesInfo name="DOI" value="10.17487/RFC8174"/>
      </reference>
      <reference anchor="RFC7845">
        <front>
          <title>Ogg Encapsulation for the Opus Audio Codec</title>
          <author fullname="T. Terriberry" initials="T." surname="Terriberry"/>
          <author fullname="R. Lee" initials="R." surname="Lee"/>
          <author fullname="R. Giles" initials="R." surname="Giles"/>
          <date month="April" year="2016"/>
          <abstract>
            <t>This document defines the Ogg encapsulation for the Opus interactive speech and audio codec. This allows data encoded in the Opus format to be stored in an Ogg logical bitstream.</t>
          </abstract>
        </front>
        <seriesInfo name="RFC" value="7845"/>
        <seriesInfo name="DOI" value="10.17487/RFC7845"/>
      </reference>
    </references>
    <?line 681?>

<section numbered="false" anchor="acknowledgments">
      <name>Acknowledgments</name>
      <t>This draft is the work of many people: Vlad Shubin, Nitin Garg, Milen Lazarov,
Benny Luo, Nick Ruff, Konstantin Tsoy, Nick Wu.</t>
    </section>
  </back>
</rfc>
