
Commit 4f1b24b3 authored by Kevin Hufnagle, committed by android-build-merger


docs: Created content describing high-performance audio. am: 44aff87b am: 52648a79 am: 7ac4b0bf am: 2359b628
am: 18267e9b

* commit '18267e9b':
  docs: Created content describing high-performance audio.

Change-Id: I4bf805d7058b43c334b8c79079c00fc2f65dc5e9
parents 055d0dcf 18267e9b
+10 −0
@@ -60,6 +60,16 @@ toc:
    path: /ndk/guides/audio/basics.html
  - title: OpenSL ES for Android
    path: /ndk/guides/audio/opensl-for-android.html
  - title: Audio Input Latency
    path: /ndk/guides/audio/input-latency.html
  - title: Audio Output Latency
    path: /ndk/guides/audio/output-latency.html
  - title: Floating-Point Audio
    path: /ndk/guides/audio/floating-point.html
  - title: Sample Rates
    path: /ndk/guides/audio/sample-rates.html
  - title: OpenSL ES Programming Notes
    path: /ndk/guides/audio/opensl-prog-notes.html

- title: Vulkan
  path: /ndk/guides/graphics/index.html
+58 −14
-page.title=OpenSL ES™ Basics
+page.title=High-Performance Audio Basics
@jd:body

<div id="qv-wrapper">
@@ -6,26 +6,51 @@ page.title=OpenSL ES™ Basics
      <h2>On this page</h2>

      <ol>
        <li><a href="#overview">Building Great Audio Apps</a></li>
        <li><a href="#adding">Adding OpenSL ES to Your App</a></li>
        <li><a href="#building">Building and Debugging</a></li>
        <li><a href="#power">Audio Power Consumption</a></li>
        <li><a href="#samples">Samples</a></li>
      </ol>
    </div>
  </div>

<a href="https://www.youtube.com/watch?v=d3kfEeMZ65c" class="notice-developers-video">
<div>
    <h3>Video</h3>
    <p>Google I/O 2013 - High Performance Audio</p>
</div>
</a>

<p>
The Khronos Group's OpenSL ES standard exposes audio features
similar to those in the {@link android.media.MediaPlayer} and {@link android.media.MediaRecorder}
APIs in the Android Java framework. OpenSL ES provides a C language interface as well as
C++ bindings, allowing you to call it from code written in either language.
</p>

<p>
-This page describes how to add these audio APIs into your app's source code, and how to incorporate
-them into the build process.
+This page describes the typical use cases for these high-performance audio APIs, how to add them
+into your app's source code, and how to incorporate them into the build process.
</p>

<h2 id="overview">Building Great Audio Apps</h2>

<p>
The OpenSL ES APIs are available to help you develop and improve your app's audio performance.
 Some typical use cases include the following:</p>

<ul>
  <li>Digital Audio Workstations (DAWs).</li>
  <li>Synthesizers.</li>
  <li>Drum machines.</li>
  <li>Music learning apps.</li>
  <li>Karaoke apps.</li>
  <li>DJ mixing.</li>
  <li>Audio effects.</li>
  <li>Video/audio conferencing.</li>
</ul>

<h2 id="adding">Adding OpenSL ES to Your App</h2>

<p>
@@ -45,6 +70,18 @@ Android extensions</a> as well, include the {@code OpenSLES_Android.h} header fi
#include &lt;SLES/OpenSLES_Android.h&gt;
</pre>

<p>
When you include the {@code OpenSLES_Android.h} header file, the following headers are included
automatically:
</p>
<pre>
#include &lt;SLES/OpenSLES_AndroidConfiguration.h&gt;
#include &lt;SLES/OpenSLES_AndroidMetadata.h&gt;
</pre>

<p class="note"><strong>Note: </strong>
These headers are not required, but are shown as an aid in learning the API.
</p>

<h2 id="building">Building and Debugging</h2>

@@ -69,9 +106,9 @@ for a given use case.
</p>

<p>
-We use asserts in our <a href="https://github.com/googlesamples/android-ndk">examples</a>, because
-they help catch unrealistic conditions that would indicate a coding error. We have used explicit
-error handling for other conditions more likely to occur in production.
+We use asserts in our <a class="external-link" href="https://github.com/googlesamples/android-ndk">
+examples</a>, because they help catch unrealistic conditions that would indicate a coding error. We
+have used explicit error handling for other conditions more likely to occur in production.
</p>

<p>
@@ -91,18 +128,25 @@ $ adb logcat
</pre>

<p>
-To examine the log from Android Studio, either click the <em>Logcat</em> tab in the
-<a href="{@docRoot}tools/debugging/debugging-studio.html#runDebug"><em>Debug</em></a>
-window, or click the <em>Devices | logcat</em> tab in the
-<a href="{@docRoot}tools/debugging/debugging-studio.html#systemLogView"><em>Android DDMS</em></a>
+To examine the log from Android Studio, either click the <strong>Logcat</strong> tab in the
+<a href="{@docRoot}tools/debugging/debugging-studio.html#runDebug">Debug</a>
+window, or click the <strong>Devices | logcat</strong> tab in the
+<a href="{@docRoot}tools/debugging/debugging-studio.html#systemLogView">Android DDMS</a>
window.
</p>

<h2 id="power">Audio Power Consumption</h2>
<p>Constantly outputting audio incurs significant power consumption. Ensure that you stop the
 output in the
 <a href="{@docRoot}reference/android/app/Activity.html#onPause()">onPause()</a> method.
 Also consider pausing output that has gone silent after some period of user inactivity.
</p>
<h2 id="samples">Samples</h2>

<p>
Supported and tested example code that you can use as a model for your own code resides both locally
-and on GitHub. The local examples are located in
+and on
+<a class="external-link" href="https://github.com/googlesamples/android-audio-high-performance/">
+GitHub</a>. The local examples are located in
{@code platforms/android-9/samples/native-audio/}, under your NDK root installation directory.
On GitHub, they are available from the
<a class="external-link" href="https://github.com/googlesamples/android-ndk">{@code android-ndk}</a>
@@ -122,4 +166,4 @@ Android app.
For more information on differences between the reference specification and the
Android implementation, see
<a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">
OpenSL ES for Android</a>.
+101 −0
page.title=Floating-Point Audio
@jd:body

<div id="qv-wrapper">
    <div id="qv">
      <h2>On this page</h2>

      <ol>
        <li><a href="#best">Best Practices for Floating-Point Audio</a></li>
        <li><a href="#support">Floating-Point Audio in Android SDK</a></li>
        <li><a href="#more">For More Information</a></li>
      </ol>
    </div>
  </div>

<a href="https://www.youtube.com/watch?v=sIcieUqMml8" class="notice-developers-video">
<div>
    <h3>Video</h3>
    <p>Will it Float? The Glory and Shame of Floating-Point Audio</p>
</div>
</a>

<p>Using floating-point numbers to represent audio data can significantly enhance audio
 quality in high-performance audio applications. Floating point offers the following
 advantages:</p>

<ul>
<li>Wider dynamic range.</li>
<li>Consistent accuracy across the dynamic range.</li>
<li>More headroom to avoid clipping during intermediate calculations and transients.</li>
</ul>

<p>While floating-point can enhance audio quality, it does present certain disadvantages:</p>

<ul>
<li>Floating-point numbers use more memory.</li>
<li>Floating-point operations have unexpected properties; for example, addition is
 not associative.</li>
<li>Floating-point calculations can sometimes lose arithmetic precision due to rounding or
 numerically unstable algorithms.</li>
<li>Using floating-point effectively requires greater understanding to achieve accurate
 and reproducible results.</li>
</ul>

<p>
  Formerly, floating-point was notorious for being unavailable or slow. This is
  still true for low-end and embedded processors. But processors on modern
  mobile devices now have hardware floating-point with performance similar to
  (or in some cases even faster than) integer. Modern CPUs also support
  <a href="http://en.wikipedia.org/wiki/SIMD" class="external-link">SIMD</a>
  (Single instruction, multiple data), which can improve performance further.
</p>

<h2 id="best">Best Practices for Floating-Point Audio</h2>
<p>The following best practices help you avoid problems with floating-point calculations:</p>
<ul>
<li>Use double precision floating-point for infrequent calculations,
such as computing filter coefficients.</li>
<li>Pay attention to the order of operations.</li>
<li>Declare explicit variables for intermediate values.</li>
<li>Use parentheses liberally.</li>
<li>If you get a NaN or infinity result, use binary search to discover
where it was introduced.</li>
</ul>

<h2 id="support">Floating-Point Audio in Android SDK</h2>

<p>For floating-point audio, the audio format encoding
 <code>AudioFormat.ENCODING_PCM_FLOAT</code> is used similarly to
 <code>ENCODING_PCM_16_BIT</code> or <code>ENCODING_PCM_8_BIT</code> for specifying
 AudioTrack data
formats. The corresponding overloaded method <code>AudioTrack.write()</code>
 takes in a float array to deliver data.</p>

<pre>
   public int write(float[] audioData,
        int offsetInFloats,
        int sizeInFloats,
        int writeMode)
</pre>

<h2 id="more">For More Information</h2>

<p>The following Wikipedia pages are helpful in understanding floating-point audio:</p>

<ul>
<li><a href="http://en.wikipedia.org/wiki/Audio_bit_depth" class="external-link" >Audio bit depth</a></li>
<li><a href="http://en.wikipedia.org/wiki/Floating_point" class="external-link" >Floating point</a></li>
<li><a href="http://en.wikipedia.org/wiki/IEEE_floating_point" class="external-link" >IEEE 754 floating-point</a></li>
<li><a href="http://en.wikipedia.org/wiki/Loss_of_significance" class="external-link" >Loss of significance</a>
 (catastrophic cancellation)</li>
<li><a href="https://en.wikipedia.org/wiki/Numerical_stability" class="external-link" >Numerical stability</a></li>
</ul>

<p>The following article provides information on those aspects of floating-point that have a
 direct impact on designers of computer systems:</p>
<ul>
<li><a href="http://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html" class="external-link" >What every
 computer scientist should know about floating-point arithmetic</a>
by David Goldberg, Xerox PARC (edited reprint).</li>
</ul>
+22 −10
-page.title=NDK Audio: OpenSL ES&#8482;
+page.title=NDK High-Performance Audio
@jd:body

<p>The NDK package includes an Android-specific implementation of the
-<a href="https://www.khronos.org/opensles/">OpenSL ES</a> API
-specification from the <a href="https://www.khronos.org">Khronos Group</a>. This library
-allows you to use C or C++ to implement high-performance, low-latency audio in your game or other
-demanding app.</p>
+<a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES™</a> API
+specification from the <a class="external-link" href="https://www.khronos.org">Khronos Group</a>.
+This library allows you to use C or C++ to implement high-performance, low-latency audio, whether
+you are writing a synthesizer, digital audio workstation, karaoke app, game,
+or other real-time app.</p>

<p>This section begins by providing some
-<a href="{@docRoot}ndk/guides/audio/basics.html">basic information</a> about the API, including how
-to incorporate it into your app. It then explains what you need to know about the
-<a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">Android-specific implementation</a>
-of OpenSL ES, focusing on differences between this implementation and the reference specification.
+<a href="{@docRoot}ndk/guides/audio/basics.html">basic information</a> about the API, including
+typical use cases and how to incorporate it into your app. It then explains what you need to know
+about the <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">Android-specific
+implementation</a> of OpenSL ES, focusing on the differences between this implementation and the
+reference specification. Next, you'll learn how to minimize
+<a href="{@docRoot}ndk/guides/audio/input-latency.html">input latency</a>
+when using built-in or external microphones,
+and about actions that you can take to minimize
+<a href="{@docRoot}ndk/guides/audio/output-latency.html">output latency</a>.
+It describes the reasons that you should use
+<a href="{@docRoot}ndk/guides/audio/floating-point.html">floating-point</a>
+numbers to represent your audio data, and it provides information that will help you choose the
+optimal <a href="{@docRoot}ndk/guides/audio/sample-rates.html">sample rate</a>. This section
+concludes with some supplemental <a href="{@docRoot}ndk/guides/audio/opensl-prog-notes.html">
+programming notes</a> to ensure proper implementation of OpenSL ES.
</p>
+95 −0
page.title=Audio Input Latency
@jd:body

<div id="qv-wrapper">
    <div id="qv">
      <h2>On this page</h2>

      <ol>
        <li><a href="#check-list">Checklist</a></li>
        <li><a href="#ways">Ways to Reduce Audio Input Latency</a></li>
        <li><a href="#avoid">What to Avoid</a></li>
      </ol>
    </div>
  </div>


<p>This page provides guidelines to help you reduce audio input latency when recording with a
built-in microphone or an external headset microphone.</p>

<h2 id="check-list">Checklist</h2>

<p>Here are a few important prerequisites:</p>

<ul>
  <li>You must use the Android-specific implementation of the
  <a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES™</a> API.</li>

  <li>If you haven't already done so, download and install the
  <a href="{@docRoot}tools/sdk/ndk/index.html">Android NDK</a>.</li>

  <li>Many of the same requirements for low-latency audio output also apply to low-latency input,
  so read the requirements for low-latency output in
  <a href="{@docRoot}ndk/guides/audio/output-latency.html">Audio Output Latency</a>.</li>
</ul>

<h2 id="ways">Ways to Reduce Audio Input Latency</h2>

<p>The following are some methods to help ensure low audio input latency:</p>

<ul>
  <li>If your app relies on low-latency audio, suggest that your users use a headset
(for example, by displaying a <em>Best with headphones</em> screen on first run). Note
that just using a headset doesn't guarantee the lowest possible latency. You may need to
perform other steps to remove any unwanted signal processing from the audio path, such as
using the <a href="http://developer.android.com/reference/android/media/MediaRecorder.AudioSource.html#VOICE_RECOGNITION">
VOICE_RECOGNITION</a> preset when recording.</li>

  <li>It's difficult to test audio input and output latency in isolation. The best solution to
determine the lowest possible audio input latency is to measure round-trip audio and divide
by two.</li>
 <li> Be prepared to handle nominal sample rates of 44,100 and 48,000 Hz as reported by
<a href="{@docRoot}reference/android/media/AudioManager.html#getProperty(java.lang.String)">
getProperty(String)</a> for
<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_SAMPLE_RATE">
PROPERTY_OUTPUT_SAMPLE_RATE</a>. Other sample rates are possible, but rare.</li>

  <li>Be prepared to handle the buffer size reported by
<a href="{@docRoot}reference/android/media/AudioManager.html#getProperty(java.lang.String)">
getProperty(String)</a> for
<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_FRAMES_PER_BUFFER">
PROPERTY_OUTPUT_FRAMES_PER_BUFFER</a>. Typical buffer sizes include 96, 128, 160, 192, 240, 256,
or 512 frames, but other values are possible.</li>
</ul>

<h2 id="avoid">What to Avoid</h2>

<p>Be sure to take these things into account to help avoid latency issues:</p>

<ul>
  <li>Don't assume that the speakers and microphones used in mobile devices have good
acoustics. Due to their small size, the acoustics are generally poor, so signal processing is
added to improve the sound quality. This signal processing introduces latency.</li>

  <li>Don't assume that your input and output callbacks are synchronized. For simultaneous input
and output, separate buffer queue completion handlers are used for each side. There is no
guarantee of the relative order of these callbacks or the synchronization of the audio clocks,
even when both sides use the same sample rate. Your application should buffer the data with
proper buffer synchronization.</li>

  <li>Don't assume that the actual sample rate exactly matches the nominal sample rate. For
example, if the nominal sample rate is 48,000 Hz, it is normal for the audio clock to advance
at a slightly different rate than the operating system {@code CLOCK_MONOTONIC}. This is because
the audio and system clocks may derive from different crystals.</li>

  <li>Don't assume that the actual playback sample rate exactly matches the actual capture sample
rate, especially if the endpoints are on separate paths. For example, if you are capturing from
the on-device microphone at 48,000 Hz nominal sample rate, and playing on USB audio
at 48,000 Hz nominal sample rate, the actual sample rates are likely to be slightly different
from each other.</li>
</ul>

<p>A consequence of potentially independent audio clocks is the need for asynchronous sample rate
conversion. A simple (though not ideal for audio quality) technique for asynchronous sample rate
conversion is to duplicate or drop samples as needed near a zero-crossing point. More
sophisticated conversions are possible.</p>