MediaCodec
public final class MediaCodec extends Object
The MediaCodec class can be used to access low-level media codecs, i.e.
encoder/decoder components.
MediaCodec is generally used like this:
MediaCodec codec = MediaCodec.createDecoderByType(type);
codec.configure(format, ...);
codec.start();

// if API level <= 20, get input and output buffer arrays here
ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();

for (;;) {
  int inputBufferIndex = codec.dequeueInputBuffer(timeoutUs);
  if (inputBufferIndex >= 0) {
    // if API level >= 21, get input buffer here
    ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferIndex);
    // fill inputBuffers[inputBufferIndex] with valid data
    ...
    codec.queueInputBuffer(inputBufferIndex, ...);
  }

  int outputBufferIndex = codec.dequeueOutputBuffer(bufferInfo, timeoutUs);
  if (outputBufferIndex >= 0) {
    // if API level >= 21, get output buffer here
    ByteBuffer outputBuffer = codec.getOutputBuffer(outputBufferIndex);
    // outputBuffer is ready to be processed or rendered.
    ...
    codec.releaseOutputBuffer(outputBufferIndex, ...);
  } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
    // no need to handle this if API level >= 21 and using getOutputBuffer(int)
    outputBuffers = codec.getOutputBuffers();
  } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // Subsequent data will conform to the new format.
    // can be ignored if API level >= 21 and using getOutputFormat(outputBufferIndex)
    MediaFormat format = codec.getOutputFormat();
    ...
  }
}
codec.stop();
codec.release();
codec = null;
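On API level 21 and above the codec can also be operated in asynchronous mode by registering a {@link Callback} via {@link #setCallback} before {@link #configure}. A minimal sketch; the MIME type, {@code format} and {@code surface} values are placeholders:

MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
codec.setCallback(new MediaCodec.Callback() {
  @Override
  public void onInputBufferAvailable(MediaCodec mc, int index) {
    ByteBuffer inputBuffer = mc.getInputBuffer(index);
    // fill inputBuffer with valid data, then:
    // mc.queueInputBuffer(index, 0, size, presentationTimeUs, 0);
  }
  @Override
  public void onOutputBufferAvailable(MediaCodec mc, int index, MediaCodec.BufferInfo info) {
    // process or render the output, then return the buffer to the codec
    mc.releaseOutputBuffer(index, false /* render */);
  }
  @Override
  public void onOutputFormatChanged(MediaCodec mc, MediaFormat newFormat) {
    // subsequent output follows newFormat
  }
  @Override
  public void onError(MediaCodec mc, MediaCodec.CodecException e) {
    // see "States and error handling" below
  }
});
codec.configure(format, surface, null /* crypto */, 0 /* flags */);
codec.start();
// buffers are delivered through the callbacks; do not call
// dequeueInputBuffer/dequeueOutputBuffer in this mode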
Each codec maintains a number of input and output buffers that are
referred to by index in API calls.
For API levels 20 and below:
The contents of these buffers are represented by the ByteBuffer[] arrays
accessible through {@link #getInputBuffers} and {@link #getOutputBuffers}.
After a successful call to {@link #start} the client "owns" neither
input nor output buffers; subsequent calls to {@link #dequeueInputBuffer}
and {@link #dequeueOutputBuffer} then transfer ownership from the codec
to the client.
The client is not required to resubmit/release buffers immediately
to the codec; the sample code above does so only for simplicity's sake.
Nonetheless, it is possible that a codec may hold off on generating
output buffers until all outstanding buffers have been
released/resubmitted.
Once the client has an input buffer available it can fill it with data
and submit it to the codec via a call to {@link #queueInputBuffer}.
Do not submit multiple input buffers with the same timestamp (unless
it is codec-specific data marked as such using the flag
{@link #BUFFER_FLAG_CODEC_CONFIG}).
The codec in turn will return an output buffer to the client in response
to {@link #dequeueOutputBuffer}. After the output buffer has been processed
a call to {@link #releaseOutputBuffer} will return it to the codec.
If a video surface has been provided in the call to {@link #configure},
{@link #releaseOutputBuffer} optionally allows rendering of the buffer
to the surface.
Input buffers (for decoders) and output buffers (for encoders) contain
encoded data according to the format's type. For video types this data
is all the encoded data representing a single moment in time; for audio
data this is slightly relaxed in that a buffer may contain multiple
encoded frames of audio. In either case, buffers do not start and end on
arbitrary byte boundaries: this is not a stream of bytes, it is a stream
of access units.
Most formats also require the actual data to be prefixed by a number
of buffers containing setup data, or codec specific data, i.e. the
first few buffers submitted to the codec object after starting it must
be codec specific data marked as such using the flag {@link #BUFFER_FLAG_CODEC_CONFIG}
in a call to {@link #queueInputBuffer}.
Codec specific data included in the format passed to {@link #configure}
(in ByteBuffer entries with keys "csd-0", "csd-1", ...) is automatically
submitted to the codec; this data MUST NOT be submitted explicitly by the
client.
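For example, a track format obtained from {@link android.media.MediaExtractor#getTrackFormat MediaExtractor.getTrackFormat} already carries any required "csd-*" entries; alternatively they can be attached to a hand-built format. A sketch; {@code extractor}, {@code trackIndex}, {@code surface}, the dimensions and the {@code sps}/{@code pps} byte arrays are placeholders:

// Typical case: the extractor's track format already contains "csd-0", "csd-1", ...
MediaFormat trackFormat = extractor.getTrackFormat(trackIndex);
codec.configure(trackFormat, surface, null /* crypto */, 0 /* flags */);

// Manual alternative: attach codec specific data to a format built by hand.
MediaFormat manual = MediaFormat.createVideoFormat("video/avc", width, height);
manual.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); // SPS NAL unit
manual.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); // PPS NAL unit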
Once the client reaches the end of the input data it signals the end of
the input stream by specifying a flag of {@link #BUFFER_FLAG_END_OF_STREAM} in the call to
{@link #queueInputBuffer}. The codec will continue to return output buffers
until it eventually signals the end of the output stream by specifying
the same flag ({@link #BUFFER_FLAG_END_OF_STREAM}) on the BufferInfo returned in
{@link #dequeueOutputBuffer}. Do not submit additional input buffers after
signaling the end of the input stream, unless the codec has been flushed,
or stopped and restarted.
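A sketch of that handshake in synchronous mode; {@code timeoutUs} is a placeholder:

// Signal end of input: queue an empty buffer carrying the EOS flag.
int inIndex = codec.dequeueInputBuffer(timeoutUs);
if (inIndex >= 0) {
  codec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
}

// Drain until the codec echoes the EOS flag on an output buffer.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean outputDone = false;
while (!outputDone) {
  int outIndex = codec.dequeueOutputBuffer(info, timeoutUs);
  if (outIndex >= 0) {
    // process or render the buffer, then release it
    codec.releaseOutputBuffer(outIndex, false /* render */);
    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
      outputDone = true;
    }
  }
}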
Seeking & Adaptive Playback Support
You can check if a decoder supports adaptive playback via {@link
MediaCodecInfo.CodecCapabilities#isFeatureSupported}. Adaptive playback
is only supported if you configure the codec to decode onto a {@link
android.view.Surface}.
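A sketch of that check, using "video/avc" as an example MIME type:

MediaCodecInfo.CodecCapabilities caps =
    codec.getCodecInfo().getCapabilitiesForType("video/avc");
boolean adaptive = caps.isFeatureSupported(
    MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback);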
For decoders that do not support adaptive playback (including
when not decoding onto a Surface)
In order to start decoding data that's not adjacent to previously submitted
data (i.e. after a seek) one must {@link #flush} the decoder.
Any input or output buffers the client may own at the point of the flush are
immediately revoked, i.e. after a call to {@link #flush} the client does not
own any buffers anymore.
It is important that the input data after a flush starts at a suitable
stream boundary. The first frame must be able to be decoded completely on
its own (for most codecs this means an I-frame), and no subsequent frames
may refer to frames before that first new frame.
Note that the format of the data submitted after a flush must not change;
flush does not support format discontinuities.
For that, a full {@link #stop}, {@link #configure configure()}, {@link #start}
cycle is necessary.
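A sketch of handling a seek with such a decoder, assuming the input is read from a {@link android.media.MediaExtractor} named {@code extractor} and {@code targetTimeUs} is the seek target:

// Discard all in-flight data; after this call the client owns no buffers.
codec.flush();
// Reposition the input to a suitable stream boundary (a sync frame).
extractor.seekTo(targetTimeUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);
// Resume the regular dequeue/queue loop; the first frame submitted after the
// flush is the sync frame the extractor repositioned to.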
For decoders that support adaptive playback
In order to start decoding data that's not adjacent to previously submitted
data (i.e. after a seek) it is not necessary to {@link #flush} the
decoder.
It is still important that the input data after the discontinuity starts
at a suitable stream boundary (e.g. I-frame), and that no new frames refer
to frames before the first frame of the new input data segment.
For some video formats it is also possible to change the picture size
mid-stream. To do this for H.264, the new Sequence Parameter Set (SPS) and
Picture Parameter Set (PPS) values must be packaged together with an
Instantaneous Decoder Refresh (IDR) frame in a single buffer, which then
can be enqueued as a regular input buffer.
The client will receive an {@link #INFO_OUTPUT_FORMAT_CHANGED} return
value from {@link #dequeueOutputBuffer dequeueOutputBuffer()} or
{@link Callback#onOutputBufferAvailable onOutputBufferAvailable()}
just after the picture-size change takes place and before any
frames with the new size have been returned.
Be careful when calling {@link #flush} shortly after you have changed
the picture size. If you have not received confirmation of the picture
size change, you will need to repeat the request for the new picture size.
E.g. for H.264 you will need to prepend the PPS/SPS to the new IDR
frame to ensure that the codec receives the picture size change request.
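A hedged sketch of repeating that request, assuming an H.264 Annex-B byte stream in which {@code sps}, {@code pps} and {@code idrFrame} are byte arrays that already include their start codes:

int inputBufferIndex = codec.dequeueInputBuffer(timeoutUs);
if (inputBufferIndex >= 0) {
  ByteBuffer inputBuffer = codec.getInputBuffer(inputBufferIndex);
  // package the parameter sets together with the IDR frame in a single buffer
  inputBuffer.put(sps).put(pps).put(idrFrame);
  codec.queueInputBuffer(inputBufferIndex, 0, inputBuffer.position(),
      presentationTimeUs, 0 /* regular input, not BUFFER_FLAG_CODEC_CONFIG */);
}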
States and error handling
During its life, a codec conceptually exists in one of the following states:
Initialized, Configured, Executing, Error, and Uninitialized (omitting transitory states
between them). When created by one of the factory methods,
the codec is in the Initialized state; {@link #configure} brings it to the
Configured state; {@link #start} brings it to the Executing state.
In the Executing state, decoding or encoding occurs through the buffer queue
manipulation described above. The method {@link #stop}
returns the codec to the Initialized state, whereupon it may be configured again,
and {@link #release} brings the codec to the terminal Uninitialized state. When
a codec error occurs, the codec moves to the Error state. Use {@link #reset} to
bring the codec back to the Initialized state, or {@link #release} to move it
to the Uninitialized state.
The factory methods
{@link #createByCodecName},
{@link #createDecoderByType},
and {@link #createEncoderByType}
throw {@link java.io.IOException} on failure which
the caller must catch or declare to pass up.
MediaCodec methods throw {@link java.lang.IllegalStateException}
when the method is called from a codec state that does not allow it;
this is typically due to incorrect application API usage.
Methods involving secure buffers may throw
{@link MediaCodec.CryptoException}, which
has further error information obtainable from {@link MediaCodec.CryptoException#getErrorCode}.
Internal codec errors result in a {@link MediaCodec.CodecException},
which may be due to media content corruption, hardware failure, resource exhaustion,
and so forth, even when the application is correctly using the API.
The recommended action when receiving a {@link MediaCodec.CodecException} can be determined by
calling {@link MediaCodec.CodecException#isRecoverable} and
{@link MediaCodec.CodecException#isTransient}.
If {@link MediaCodec.CodecException#isRecoverable} returns true,
then a {@link #stop}, {@link #configure}, and {@link #start} can be performed to recover.
If {@link MediaCodec.CodecException#isTransient} returns true,
then resources are temporarily unavailable and the method may be retried at a later time.
If both {@link MediaCodec.CodecException#isRecoverable}
and {@link MediaCodec.CodecException#isTransient} return false,
then the {@link MediaCodec.CodecException} is fatal and the codec must be
{@link #reset reset} or {@link #release released}.
{@link MediaCodec.CodecException#isRecoverable} and
{@link MediaCodec.CodecException#isTransient} never both return true at the same time.
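A sketch of that decision logic around a codec call; the surrounding variables ({@code index}, {@code size}, {@code presentationTimeUs}, {@code format}, {@code surface}) are placeholders:

try {
  codec.queueInputBuffer(index, 0, size, presentationTimeUs, 0);
} catch (MediaCodec.CodecException e) {
  if (e.isTransient()) {
    // Resources are temporarily unavailable: retry the same call later.
  } else if (e.isRecoverable()) {
    // Recover by cycling the codec through stop/configure/start.
    codec.stop();
    codec.configure(format, surface, null /* crypto */, 0 /* flags */);
    codec.start();
  } else {
    // Fatal: the codec must be reset or released.
    codec.reset();
  }
}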
Fields Summary
Type | Field | Description
---|---|---
public static final int | BUFFER_FLAG_SYNC_FRAME | This indicates that the (encoded) buffer marked as such contains the data for a key frame.
public static final int | BUFFER_FLAG_KEY_FRAME | This indicates that the (encoded) buffer marked as such contains the data for a key frame.
public static final int | BUFFER_FLAG_CODEC_CONFIG | This indicates that the buffer marked as such contains codec initialization / codec specific data instead of media data.
public static final int | BUFFER_FLAG_END_OF_STREAM | This signals the end of stream, i.e. no buffers will be available after this, unless of course, {@link #flush} follows.
private EventHandler | mEventHandler |
private Callback | mCallback |
private static final int | EVENT_CALLBACK |
private static final int | EVENT_SET_CALLBACK |
private static final int | CB_INPUT_AVAILABLE |
private static final int | CB_OUTPUT_AVAILABLE |
private static final int | CB_ERROR |
private static final int | CB_OUTPUT_FORMAT_CHANGE |
public static final int | CONFIGURE_FLAG_ENCODE | If this codec is to be used as an encoder, pass this flag.
public static final int | CRYPTO_MODE_UNENCRYPTED |
public static final int | CRYPTO_MODE_AES_CTR |
public static final int | INFO_TRY_AGAIN_LATER | If a non-negative timeout had been specified in the call to {@link #dequeueOutputBuffer}, indicates that the call timed out.
public static final int | INFO_OUTPUT_FORMAT_CHANGED | The output format has changed; subsequent data will follow the new format. {@link #getOutputFormat()} returns the new format. Note that you can also use the new {@link #getOutputFormat(int)} method to get the format for a specific output buffer. This frees you from having to track output format changes.
public static final int | INFO_OUTPUT_BUFFERS_CHANGED | The output buffers have changed; the client must refer to the new set of output buffers returned by {@link #getOutputBuffers} from this point on.
private ByteBuffer[] | mCachedInputBuffers |
private ByteBuffer[] | mCachedOutputBuffers |
private final BufferMap | mDequeuedInputBuffers |
private final BufferMap | mDequeuedOutputBuffers |
private final Object | mBufferLock |
public static final int | VIDEO_SCALING_MODE_SCALE_TO_FIT | The content is scaled to the surface dimensions.
public static final int | VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING | The content is scaled while maintaining its aspect ratio; the whole surface area is used and content may be cropped.
public static final String | PARAMETER_KEY_VIDEO_BITRATE | Change a video encoder's target bitrate on the fly. The value is an Integer object containing the new bitrate in bps.
public static final String | PARAMETER_KEY_SUSPEND | Temporarily suspend/resume encoding of input data. While suspended, input data is effectively discarded instead of being fed into the encoder. This parameter really only makes sense to use with an encoder in "surface-input" mode, as the client code has no control over the input side of the encoder in that case. The value is an Integer object containing the value 1 to suspend or the value 0 to resume.
public static final String | PARAMETER_KEY_REQUEST_SYNC_FRAME | Request that the encoder produce a sync frame "soon". Provide an Integer with the value 0.
private long | mNativeContext |
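For example, the PARAMETER_KEY_* values above are applied at runtime through {@link #setParameters} with a {@link android.os.Bundle}; the encoder instance and the new bitrate value are placeholders:

Bundle params = new Bundle();
// request a sync frame "soon" and change the encoder's target bitrate on the fly
params.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME, 0);
params.putInt(MediaCodec.PARAMETER_KEY_VIDEO_BITRATE, 2000000 /* bps, placeholder */);
encoder.setParameters(params);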
Constructors Summary
private MediaCodec(String name, boolean nameIsType, boolean encoder)
Looper looper;
if ((looper = Looper.myLooper()) != null) {
mEventHandler = new EventHandler(this, looper);
} else if ((looper = Looper.getMainLooper()) != null) {
mEventHandler = new EventHandler(this, looper);
} else {
mEventHandler = null;
}
mBufferLock = new Object();
native_setup(name, nameIsType, encoder);
|
Methods Summary
private final void | cacheBuffers(boolean input)
ByteBuffer[] buffers = null;
try {
buffers = getBuffers(input);
invalidateByteBuffers(buffers);
} catch (IllegalStateException e) {
// we don't get buffers in async mode
}
if (input) {
mCachedInputBuffers = buffers;
} else {
mCachedOutputBuffers = buffers;
}
| public void | configure(android.media.MediaFormat format, android.view.Surface surface, android.media.MediaCrypto crypto, int flags)Configures a component.
Map<String, Object> formatMap = format.getMap();
String[] keys = null;
Object[] values = null;
if (format != null) {
keys = new String[formatMap.size()];
values = new Object[formatMap.size()];
int i = 0;
for (Map.Entry<String, Object> entry: formatMap.entrySet()) {
if (entry.getKey().equals(MediaFormat.KEY_AUDIO_SESSION_ID)) {
int sessionId = 0;
try {
sessionId = (Integer)entry.getValue();
}
catch (Exception e) {
throw new IllegalArgumentException("Wrong Session ID Parameter!");
}
keys[i] = "audio-hw-sync";
values[i] = AudioSystem.getAudioHwSyncForSession(sessionId);
} else {
keys[i] = entry.getKey();
values[i] = entry.getValue();
}
++i;
}
}
native_configure(keys, values, surface, crypto, flags);
| public static android.media.MediaCodec | createByCodecName(java.lang.String name)If you know the exact name of the component you want to instantiate
use this method to instantiate it. Use with caution.
Likely to be used with information obtained from {@link android.media.MediaCodecList}
return new MediaCodec(
name, false /* nameIsType */, false /* unused */);
| public static android.media.MediaCodec | createDecoderByType(java.lang.String type)Instantiate a decoder supporting input data of the given mime type.
The following is a partial list of defined mime types and their semantics:
- "video/x-vnd.on2.vp8" - VP8 video (i.e. video in .webm)
- "video/x-vnd.on2.vp9" - VP9 video (i.e. video in .webm)
- "video/avc" - H.264/AVC video
- "video/hevc" - H.265/HEVC video
- "video/mp4v-es" - MPEG4 video
- "video/3gpp" - H.263 video
- "audio/3gpp" - AMR narrowband audio
- "audio/amr-wb" - AMR wideband audio
- "audio/mpeg" - MPEG1/2 audio layer III
- "audio/mp4a-latm" - AAC audio (note, this is raw AAC packets, not packaged in LATM!)
- "audio/vorbis" - vorbis audio
- "audio/g711-alaw" - G.711 alaw audio
- "audio/g711-mlaw" - G.711 ulaw audio
return new MediaCodec(type, true /* nameIsType */, false /* encoder */);
| public static android.media.MediaCodec | createEncoderByType(java.lang.String type)Instantiate an encoder supporting output data of the given mime type.
return new MediaCodec(type, true /* nameIsType */, true /* encoder */);
| public final native android.view.Surface | createInputSurface()Requests a Surface to use as the input to an encoder, in place of input buffers. This
may only be called after {@link #configure} and before {@link #start}.
The application is responsible for calling release() on the Surface when
done.
The Surface must be rendered with a hardware-accelerated API, such as OpenGL ES.
{@link android.view.Surface#lockCanvas(android.graphics.Rect)} may fail or produce
unexpected results.
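A sketch of that surface-input path for an AVC encoder; the width, height, bit rate and frame rate values are placeholders:

MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null /* surface */, null /* crypto */,
    MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // render into this with GLES
encoder.start();
// ... draw frames into inputSurface and drain encoder output as usual ...
encoder.signalEndOfInputStream(); // instead of an EOS-flagged empty buffer
inputSurface.release();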
| public final int | dequeueInputBuffer(long timeoutUs)Returns the index of an input buffer to be filled with valid data
or -1 if no such buffer is currently available.
This method will return immediately if timeoutUs == 0, wait indefinitely
for the availability of an input buffer if timeoutUs < 0 or wait up
to "timeoutUs" microseconds if timeoutUs > 0.
int res = native_dequeueInputBuffer(timeoutUs);
if (res >= 0) {
synchronized(mBufferLock) {
validateInputByteBuffer(mCachedInputBuffers, res);
}
}
return res;
| public final int | dequeueOutputBuffer(android.media.MediaCodec$BufferInfo info, long timeoutUs)Dequeue an output buffer, block at most "timeoutUs" microseconds.
Returns the index of an output buffer that has been successfully
decoded or one of the INFO_* constants below.
int res = native_dequeueOutputBuffer(info, timeoutUs);
synchronized(mBufferLock) {
if (res == INFO_OUTPUT_BUFFERS_CHANGED) {
cacheBuffers(false /* input */);
} else if (res >= 0) {
validateOutputByteBuffer(mCachedOutputBuffers, res, info);
}
}
return res;
| protected void | finalize()
native_finalize();
| public final void | flush()Flush both input and output ports of the component, all indices
previously returned in calls to {@link #dequeueInputBuffer} and
{@link #dequeueOutputBuffer} become invalid.
If codec is configured in asynchronous mode, call {@link #start}
after {@code flush} has returned to resume codec operations. The
codec will not request input buffers until this has happened.
If codec is configured in synchronous mode, codec will resume
automatically if an input surface was created. Otherwise, it
will resume when {@link #dequeueInputBuffer} is called.
synchronized(mBufferLock) {
invalidateByteBuffers(mCachedInputBuffers);
invalidateByteBuffers(mCachedOutputBuffers);
mDequeuedInputBuffers.clear();
mDequeuedOutputBuffers.clear();
}
native_flush();
| private final void | freeAllTrackedBuffers()
synchronized(mBufferLock) {
freeByteBuffers(mCachedInputBuffers);
freeByteBuffers(mCachedOutputBuffers);
mCachedInputBuffers = null;
mCachedOutputBuffers = null;
mDequeuedInputBuffers.clear();
mDequeuedOutputBuffers.clear();
}
| private final void | freeByteBuffer(java.nio.ByteBuffer buffer)
if (buffer != null /* && buffer.isDirect() */) {
// all of our ByteBuffers are direct
java.nio.NioUtils.freeDirectBuffer(buffer);
}
| private final void | freeByteBuffers(java.nio.ByteBuffer[] buffers)
if (buffers != null) {
for (ByteBuffer buffer: buffers) {
freeByteBuffer(buffer);
}
}
| private final native java.nio.ByteBuffer | getBuffer(boolean input, int index)
| private final native java.nio.ByteBuffer[] | getBuffers(boolean input)
| public android.media.MediaCodecInfo | getCodecInfo()Get the codec info. If the codec was created by createDecoderByType
or createEncoderByType, what component is chosen is not known beforehand,
and thus the caller does not have the MediaCodecInfo.
return MediaCodecList.getInfoFor(getName());
| private final native java.util.Map | getFormatNative(boolean input)
| private final native android.media.Image | getImage(boolean input, int index)
| public java.nio.ByteBuffer | getInputBuffer(int index)Returns a {@link java.nio.Buffer#clear cleared}, writable ByteBuffer
object for a dequeued input buffer index to contain the input data.
After calling this method any ByteBuffer or Image object
previously returned for the same input index MUST no longer
be used.
ByteBuffer newBuffer = getBuffer(true /* input */, index);
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedInputBuffers, index);
mDequeuedInputBuffers.put(index, newBuffer);
}
return newBuffer;
| public java.nio.ByteBuffer[] | getInputBuffers()Retrieve the set of input buffers. Call this after start()
returns. After calling this method, any ByteBuffers
previously returned by an earlier call to this method MUST no
longer be used.
if (mCachedInputBuffers == null) {
throw new IllegalStateException();
}
// FIXME: check codec status
return mCachedInputBuffers;
| public final android.media.MediaFormat | getInputFormat()Call this after {@link #configure} returns successfully to
get the input format accepted by the codec. Do this to
determine what optional configuration parameters were
supported by the codec.
return new MediaFormat(getFormatNative(true /* input */));
| public android.media.Image | getInputImage(int index)Returns a writable Image object for a dequeued input buffer
index to contain the raw input video frame.
After calling this method any ByteBuffer or Image object
previously returned for the same input index MUST no longer
be used.
Image newImage = getImage(true /* input */, index);
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedInputBuffers, index);
mDequeuedInputBuffers.put(index, newImage);
}
return newImage;
| public final native java.lang.String | getName()Get the component name. If the codec was created by createDecoderByType
or createEncoderByType, what component is chosen is not known beforehand.
| public java.nio.ByteBuffer | getOutputBuffer(int index)Returns a read-only ByteBuffer for a dequeued output buffer
index. The position and limit of the returned buffer are set
to the valid output data.
After calling this method, any ByteBuffer or Image object
previously returned for the same output index MUST no longer
be used.
ByteBuffer newBuffer = getBuffer(false /* input */, index);
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedOutputBuffers, index);
mDequeuedOutputBuffers.put(index, newBuffer);
}
return newBuffer;
| public java.nio.ByteBuffer[] | getOutputBuffers()Retrieve the set of output buffers. Call this after start()
returns and whenever dequeueOutputBuffer signals an output
buffer change by returning {@link
#INFO_OUTPUT_BUFFERS_CHANGED}. After calling this method, any
ByteBuffers previously returned by an earlier call to this
method MUST no longer be used.
if (mCachedOutputBuffers == null) {
throw new IllegalStateException();
}
// FIXME: check codec status
return mCachedOutputBuffers;
| public final android.media.MediaFormat | getOutputFormat()Call this after dequeueOutputBuffer signals a format change by returning
{@link #INFO_OUTPUT_FORMAT_CHANGED}.
You can also call this after {@link #configure} returns
successfully to get the output format initially configured
for the codec. Do this to determine what optional
configuration parameters were supported by the codec.
return new MediaFormat(getFormatNative(false /* input */));
| public final android.media.MediaFormat | getOutputFormat(int index)Returns the output format for a specific output buffer.
return new MediaFormat(getOutputFormatNative(index));
| private final native java.util.Map | getOutputFormatNative(int index)
| public android.media.Image | getOutputImage(int index)Returns a read-only Image object for a dequeued output buffer
index that contains the raw video frame.
After calling this method, any ByteBuffer or Image object previously
returned for the same output index MUST no longer be used.
Image newImage = getImage(false /* input */, index);
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedOutputBuffers, index);
mDequeuedOutputBuffers.put(index, newImage);
}
return newImage;
| private final void | invalidateByteBuffer(java.nio.ByteBuffer[] buffers, int index)
if (buffers != null && index >= 0 && index < buffers.length) {
ByteBuffer buffer = buffers[index];
if (buffer != null) {
buffer.setAccessible(false);
}
}
| private final void | invalidateByteBuffers(java.nio.ByteBuffer[] buffers)
if (buffers != null) {
for (ByteBuffer buffer: buffers) {
if (buffer != null) {
buffer.setAccessible(false);
}
}
}
| private final native void | native_configure(java.lang.String[] keys, java.lang.Object[] values, android.view.Surface surface, android.media.MediaCrypto crypto, int flags)
| private final native int | native_dequeueInputBuffer(long timeoutUs)
| private final native int | native_dequeueOutputBuffer(android.media.MediaCodec$BufferInfo info, long timeoutUs)
| private final native void | native_finalize()
| private final native void | native_flush()
| private static final native void | native_init()
| private final native void | native_queueInputBuffer(int index, int offset, int size, long presentationTimeUs, int flags)
| private final native void | native_queueSecureInputBuffer(int index, int offset, android.media.MediaCodec$CryptoInfo info, long presentationTimeUs, int flags)
| private final native void | native_release()
| private final native void | native_reset()
| private final native void | native_setCallback(android.media.MediaCodec$Callback cb)
| private final native void | native_setup(java.lang.String name, boolean nameIsType, boolean encoder)
| private final native void | native_start()
| private final native void | native_stop()
| private void | postEventFromNative(int what, int arg1, int arg2, java.lang.Object obj)
if (mEventHandler != null) {
Message msg = mEventHandler.obtainMessage(what, arg1, arg2, obj);
mEventHandler.sendMessage(msg);
}
| public final void | queueInputBuffer(int index, int offset, int size, long presentationTimeUs, int flags)After filling a range of the input buffer at the specified index
submit it to the component. Once an input buffer is queued to
the codec, it MUST NOT be used until it is later retrieved by
{@link #getInputBuffer} in response to a {@link #dequeueInputBuffer}
return value or a {@link Callback#onInputBufferAvailable}
callback.
Many decoders require the actual compressed data stream to be
preceded by "codec specific data", i.e. setup data used to initialize
the codec such as PPS/SPS in the case of AVC video or code tables
in the case of vorbis audio.
The class {@link android.media.MediaExtractor} provides codec
specific data as part of
the returned track format in entries named "csd-0", "csd-1" ...
These buffers can be submitted directly after {@link #start} or
{@link #flush} by specifying the flag {@link
#BUFFER_FLAG_CODEC_CONFIG}. However, if you configure the
codec with a {@link MediaFormat} containing these keys, they
will be automatically submitted by MediaCodec directly after
start. Therefore, the use of {@link
#BUFFER_FLAG_CODEC_CONFIG} flag is discouraged and is
recommended only for advanced users.
To indicate that this is the final piece of input data (or rather that
no more input data follows unless the decoder is subsequently flushed)
specify the flag {@link #BUFFER_FLAG_END_OF_STREAM}.
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedInputBuffers, index);
mDequeuedInputBuffers.remove(index);
}
try {
native_queueInputBuffer(
index, offset, size, presentationTimeUs, flags);
} catch (CryptoException | IllegalStateException e) {
revalidateByteBuffer(mCachedInputBuffers, index);
throw e;
}
| public final void | queueSecureInputBuffer(int index, int offset, android.media.MediaCodec$CryptoInfo info, long presentationTimeUs, int flags)Similar to {@link #queueInputBuffer} but submits a buffer that is
potentially encrypted.
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedInputBuffers, index);
mDequeuedInputBuffers.remove(index);
}
try {
native_queueSecureInputBuffer(
index, offset, info, presentationTimeUs, flags);
} catch (CryptoException | IllegalStateException e) {
revalidateByteBuffer(mCachedInputBuffers, index);
throw e;
}
| public final void | release()Make sure you call this when you're done to free up any opened
component instance instead of relying on the garbage collector
to do this for you at some point in the future.
freeAllTrackedBuffers(); // free buffers first
native_release();
| public final void | releaseOutputBuffer(int index, boolean render)If you are done with a buffer, use this call to return the buffer to
the codec. If you previously specified a surface when configuring this
video decoder you can optionally render the buffer.
Once an output buffer is released to the codec, it MUST NOT
be used until it is later retrieved by {@link #getOutputBuffer} in response
to a {@link #dequeueOutputBuffer} return value or a
{@link Callback#onOutputBufferAvailable} callback.
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedOutputBuffers, index);
mDequeuedOutputBuffers.remove(index);
}
releaseOutputBuffer(index, render, false /* updatePTS */, 0 /* dummy */);
| public final void | releaseOutputBuffer(int index, long renderTimestampNs)If you are done with a buffer, use this call to update its surface timestamp
and return it to the codec to render it on the output surface. If you
have not specified an output surface when configuring this video codec,
this call will simply return the buffer to the codec.
The timestamp may have special meaning depending on the destination surface.
SurfaceView specifics |
If you render your buffer on a {@link android.view.SurfaceView},
you can use the timestamp to render the buffer at a specific time (at the
VSYNC at or after the buffer timestamp). For this to work, the timestamp
needs to be reasonably close to the current {@link System#nanoTime}.
Currently, this is set as within one (1) second. A few notes:
- the buffer will not be returned to the codec until the timestamp
has passed and the buffer is no longer used by the {@link android.view.Surface}.
- buffers are processed sequentially, so you may block subsequent buffers from
being displayed on the {@link android.view.Surface}. This is important if you
want to react to user action, e.g. stop the video or seek.
- if multiple buffers are sent to the {@link android.view.Surface} to be
rendered at the same VSYNC, the last one will be shown, and the other ones
will be dropped.
- if the timestamp is not "reasonably close" to the current system
time, the {@link android.view.Surface} will ignore the timestamp, and
display the buffer at the earliest feasible time. In this mode it will not
drop frames.
- for best performance and quality, call this method when you are about
two VSYNCs' time before the desired render time. For 60Hz displays, this is
about 33 msec.
|
Once an output buffer is released to the codec, it MUST NOT
be used until it is later retrieved by {@link #getOutputBuffer} in response
to a {@link #dequeueOutputBuffer} return value or a
{@link Callback#onOutputBufferAvailable} callback.
synchronized(mBufferLock) {
invalidateByteBuffer(mCachedOutputBuffers, index);
mDequeuedOutputBuffers.remove(index);
}
releaseOutputBuffer(
index, true /* render */, true /* updatePTS */, renderTimestampNs);
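A sketch of deriving such a timestamp from a buffer's presentation time; {@code systemStartNs} (the {@link System#nanoTime} value corresponding to the first frame's presentation time), {@code firstPtsUs} and {@code bufferInfo} (the BufferInfo returned by {@link #dequeueOutputBuffer}) are placeholders maintained by the caller:

// Map the stream's presentation time onto the system clock. The result must be
// "reasonably close" (within about one second) to System.nanoTime() to take effect.
long renderTimeNs = systemStartNs + (bufferInfo.presentationTimeUs - firstPtsUs) * 1000L;
codec.releaseOutputBuffer(outputBufferIndex, renderTimeNs);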
| private final native void | releaseOutputBuffer(int index, boolean render, boolean updatePTS, long timeNs)
| public final void | reset()Returns the codec to its initial (Initialized) state.
Call this if an {@link MediaCodec.CodecException#isRecoverable unrecoverable}
error has occurred to reset the codec to its initial state after creation.
freeAllTrackedBuffers(); // free buffers first
native_reset();
| private final void | revalidateByteBuffer(java.nio.ByteBuffer[] buffers, int index)
synchronized(mBufferLock) {
if (buffers != null && index >= 0 && index < buffers.length) {
ByteBuffer buffer = buffers[index];
if (buffer != null) {
buffer.setAccessible(true);
}
}
}
| public void | setCallback(android.media.MediaCodec$Callback cb)Sets an asynchronous callback for actionable MediaCodec events.
If the client intends to use the component in asynchronous mode,
a valid callback should be provided before {@link #configure} is called.
When asynchronous callback is enabled, the client should not call
{@link #getInputBuffers}, {@link #getOutputBuffers},
{@link #dequeueInputBuffer(long)} or {@link #dequeueOutputBuffer(BufferInfo, long)}.
Also, {@link #flush} behaves differently in asynchronous mode. After calling
{@code flush}, you must call {@link #start} to "resume" receiving input buffers,
even if an input surface was created.
if (mEventHandler != null) {
// set java callback on handler
Message msg = mEventHandler.obtainMessage(EVENT_SET_CALLBACK, 0, 0, cb);
mEventHandler.sendMessage(msg);
// set native handler here, don't post to handler because
// it may cause the callback to be delayed and set in a wrong state,
// and MediaCodec is already doing it on looper.
native_setCallback(cb);
}
| public final void | setParameters(android.os.Bundle params)Communicate additional parameter changes to the component instance.
if (params == null) {
return;
}
String[] keys = new String[params.size()];
Object[] values = new Object[params.size()];
int i = 0;
for (final String key: params.keySet()) {
keys[i] = key;
values[i] = params.get(key);
++i;
}
setParameters(keys, values);
| private final native void | setParameters(java.lang.String[] keys, java.lang.Object[] values)
| public final native void | setVideoScalingMode(int mode)If a surface has been specified in a previous call to {@link #configure},
this specifies the scaling mode to use. The default is "scale to fit".
| public final native void | signalEndOfInputStream()Signals end-of-stream on input. Equivalent to submitting an empty buffer with
{@link #BUFFER_FLAG_END_OF_STREAM} set. This may only be used with
encoders receiving input from a Surface created by {@link #createInputSurface}.
| public final void | start()After successfully configuring the component, call {@code start}.
Call {@code start} also if the codec is configured in asynchronous mode,
and it has just been flushed, to resume requesting input buffers.
native_start();
synchronized(mBufferLock) {
cacheBuffers(true /* input */);
cacheBuffers(false /* input */);
}
| public final void | stop()Finish the decode/encode session; note that the codec instance
remains active and ready to be {@link #start}ed again.
To ensure that it is available to other clients, call {@link #release}
rather than relying on garbage collection to eventually do this for you.
native_stop();
freeAllTrackedBuffers();
if (mEventHandler != null) {
mEventHandler.removeMessages(EVENT_CALLBACK);
mEventHandler.removeMessages(EVENT_SET_CALLBACK);
}
| private final void | validateInputByteBuffer(java.nio.ByteBuffer[] buffers, int index)
if (buffers != null && index >= 0 && index < buffers.length) {
ByteBuffer buffer = buffers[index];
if (buffer != null) {
buffer.setAccessible(true);
buffer.clear();
}
}
| private final void | validateOutputByteBuffer(java.nio.ByteBuffer[] buffers, int index, android.media.MediaCodec$BufferInfo info)
if (buffers != null && index >= 0 && index < buffers.length) {
ByteBuffer buffer = buffers[index];
if (buffer != null) {
buffer.setAccessible(true);
buffer.limit(info.offset + info.size).position(info.offset);
}
}
|
|