MediaPlayer
public class MediaPlayer extends Object implements SubtitleController.Listener
The MediaPlayer class can be used to control playback
of audio/video files and streams. An example on how to use the methods in
this class can be found in {@link android.widget.VideoView}.
Topics covered here are:
- State Diagram
- Valid and Invalid States
- Permissions
- Register informational and error callbacks
Developer Guides
For more information about how to use MediaPlayer, read the
Media Playback developer guide.
State Diagram
Playback control of audio/video files and streams is managed as a state
machine. The following diagram shows the life cycle and the states of a
MediaPlayer object driven by the supported playback control operations.
The ovals represent the states a MediaPlayer object may reside
in. The arcs represent the playback control operations that drive the object
state transition. There are two types of arcs. The arcs with a single arrow
head represent synchronous method calls, while those with
a double arrow head represent asynchronous method calls.
From this state diagram, one can see that a MediaPlayer object has the
following states:
- When a MediaPlayer object is just created using
new or
after {@link #reset()} is called, it is in the Idle state; and after
{@link #release()} is called, it is in the End state. Between these
two states is the life cycle of the MediaPlayer object.
- There is a subtle but important difference between a newly constructed
MediaPlayer object and the MediaPlayer object after {@link #reset()}
is called. It is a programming error to invoke methods such
as {@link #getCurrentPosition()},
{@link #getDuration()}, {@link #getVideoHeight()},
{@link #getVideoWidth()}, {@link #setAudioStreamType(int)},
{@link #setLooping(boolean)},
{@link #setVolume(float, float)}, {@link #pause()}, {@link #start()},
{@link #stop()}, {@link #seekTo(int)}, {@link #prepare()} or
{@link #prepareAsync()} in the Idle state for both cases. If any of these
methods is called right after a MediaPlayer object is constructed,
the user supplied callback method OnErrorListener.onError() won't be
called by the internal player engine and the object state remains
unchanged; but if these methods are called right after {@link #reset()},
the user supplied callback method OnErrorListener.onError() will be
invoked by the internal player engine and the object will be
transferred to the Error state.
- It is also recommended that once
a MediaPlayer object is no longer being used, you call {@link #release()} immediately
so that resources used by the internal player engine associated with the
MediaPlayer object can be released immediately. Resources may include
singleton resources such as hardware acceleration components, and
failure to call {@link #release()} may cause subsequent instances of
MediaPlayer objects to fall back to software implementations or fail
altogether. Once the MediaPlayer
object is in the End state, it can no longer be used and
there is no way to bring it back to any other state.
- Furthermore, a MediaPlayer object
created using new is in the
Idle state, while those created with one
of the overloaded convenience create methods are NOT
in the Idle state. In fact, the objects are in the Prepared
state if the creation using a create method is successful.
- In general, some playback control operation may fail due to various
reasons, such as unsupported audio/video format, poorly interleaved
audio/video, resolution too high, streaming timeout, and the like.
Thus, error reporting and recovery is an important concern under
these circumstances. Sometimes, due to programming errors, invoking a playback
control operation in an invalid state may also occur. Under all these
error conditions, the internal player engine invokes a user supplied
OnErrorListener.onError() method if an OnErrorListener has been
registered beforehand via
{@link #setOnErrorListener(android.media.MediaPlayer.OnErrorListener)}.
- It is important to note that once an error occurs, the
MediaPlayer object enters the Error state (except as noted
above), even if an error listener has not been registered by the application.
- In order to reuse a MediaPlayer object that is in the
Error state and recover from the error,
{@link #reset()} can be called to restore the object to its Idle
state.
- It is good programming practice to have your application
register an OnErrorListener to look out for error notifications from
the internal player engine (see the example sketch after this list).
- IllegalStateException is
thrown to prevent programming errors such as calling {@link #prepare()},
{@link #prepareAsync()}, or one of the overloaded
setDataSource
methods in an invalid state.
- Calling
{@link #setDataSource(FileDescriptor)}, or
{@link #setDataSource(String)}, or
{@link #setDataSource(Context, Uri)}, or
{@link #setDataSource(FileDescriptor, long, long)} transfers a
MediaPlayer object in the Idle state to the
Initialized state.
- An IllegalStateException is thrown if
setDataSource() is called in any other state.
- It is good programming
practice to always look out for
IllegalArgumentException
and IOException that may be thrown from the overloaded
setDataSource methods.
- A MediaPlayer object must first enter the Prepared state
before playback can be started.
- There are two ways (synchronous vs.
asynchronous) that the Prepared state can be reached:
either a call to {@link #prepare()} (synchronous) which
transfers the object to the Prepared state once the method call
returns, or a call to {@link #prepareAsync()} (asynchronous) which
first transfers the object to the Preparing state after the
call returns (which occurs almost right away) while the internal
player engine continues working on the rest of preparation work
until the preparation work completes. When the preparation completes or when {@link #prepare()} call returns,
the internal player engine then calls a user supplied callback method,
onPrepared() of the OnPreparedListener interface, if an
OnPreparedListener is registered beforehand via {@link
#setOnPreparedListener(android.media.MediaPlayer.OnPreparedListener)}.
- It is important to note that
the Preparing state is a transient state, and the behavior
of calling any method with side effect while a MediaPlayer object is
in the Preparing state is undefined.
- An IllegalStateException is
thrown if {@link #prepare()} or {@link #prepareAsync()} is called in
any other state.
- While in the Prepared state, properties
such as audio/sound volume, screenOnWhilePlaying, looping can be
adjusted by invoking the corresponding set methods.
- To start the playback, {@link #start()} must be called. After
{@link #start()} returns successfully, the MediaPlayer object is in the
Started state. {@link #isPlaying()} can be called to test
whether the MediaPlayer object is in the Started state.
- While in the Started state, the internal player engine calls
a user supplied OnBufferingUpdateListener.onBufferingUpdate() callback
method if an OnBufferingUpdateListener has been registered beforehand
via {@link #setOnBufferingUpdateListener(OnBufferingUpdateListener)}.
This callback allows applications to keep track of the buffering status
while streaming audio/video.
- Calling {@link #start()} has no effect
on a MediaPlayer object that is already in the Started state.
- Playback can be paused and stopped, and the current playback position
can be adjusted. Playback can be paused via {@link #pause()}. When the call to
{@link #pause()} returns, the MediaPlayer object enters the
Paused state. Note that the transition from the Started
state to the Paused state and vice versa happens
asynchronously in the player engine. It may take some time before
the state is updated in calls to {@link #isPlaying()}, and it can be
a number of seconds in the case of streamed content.
- Call {@link #start()} to resume playback for a paused
MediaPlayer object; the resumed playback
position is the same as where it was paused. When the call to
{@link #start()} returns, the paused MediaPlayer object goes back to
the Started state.
- Calling {@link #pause()} has no effect on
a MediaPlayer object that is already in the Paused state.
- Calling {@link #stop()} stops playback and causes a
MediaPlayer in the Started, Paused, Prepared
or PlaybackCompleted state to enter the
Stopped state.
- Once in the Stopped state, playback cannot be started
until {@link #prepare()} or {@link #prepareAsync()} are called to set
the MediaPlayer object to the Prepared state again.
- Calling {@link #stop()} has no effect on a MediaPlayer
object that is already in the Stopped state.
- The playback position can be adjusted with a call to
{@link #seekTo(int)}.
- Although the asynchronous {@link #seekTo(int)}
call returns right away, the actual seek operation may take a while to
finish, especially for audio/video being streamed. When the actual
seek operation completes, the internal player engine calls a user
supplied OnSeekCompleteListener.onSeekComplete() if an OnSeekCompleteListener
has been registered beforehand via
{@link #setOnSeekCompleteListener(OnSeekCompleteListener)}.
- Please
note that {@link #seekTo(int)} can also be called in the other states,
such as Prepared, Paused and PlaybackCompleted
state.
- Furthermore, the actual current playback position
can be retrieved with a call to {@link #getCurrentPosition()}, which
is helpful for applications such as a Music player that need to keep
track of the playback progress.
- When the playback reaches the end of stream, the playback completes.
- If the looping mode was set to true with
{@link #setLooping(boolean)}, the MediaPlayer object shall remain in
the Started state.
- If the looping mode was set to false
, the player engine calls a user supplied callback method,
OnCompletionListener.onCompletion(), if an OnCompletionListener is registered
beforehand via {@link #setOnCompletionListener(OnCompletionListener)}.
The invocation of the callback signals that the object is now in the
PlaybackCompleted state.
- While in the PlaybackCompleted
state, calling {@link #start()} can restart the playback from the
beginning of the audio/video source.
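Putting the pieces above together, the following is a minimal sketch of a typical asynchronous playback lifecycle. The media URL and the enclosing component are placeholders for illustration only and are not part of this class.
MediaPlayer mediaPlayer = new MediaPlayer();                     // Idle state
mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        mp.reset();              // Error state -> Idle state, ready for a new data source
        return true;             // error handled; OnCompletionListener will not be called
    }
});
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();              // Prepared state -> Started state
    }
});
try {
    mediaPlayer.setDataSource("https://example.com/clip.mp3");  // placeholder URL; Idle -> Initialized
    mediaPlayer.prepareAsync();                                  // Initialized -> Preparing (returns immediately)
} catch (IOException | IllegalArgumentException e) {
    mediaPlayer.release();                                       // End state; the object can no longer be used
}
// When playback is no longer needed, call mediaPlayer.release() to free the engine resources.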
Valid and Invalid States
Method Name | Valid States | Invalid States | Comments
---|---|---|---
attachAuxEffect | {Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Idle, Error} | This method must be called after setDataSource. Calling it does not change the object state.
getAudioSessionId | any | {} | This method can be called in any state and calling it does not change the object state.
getCurrentPosition | {Idle, Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
getDuration | {Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Idle, Initialized, Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
getVideoHeight | {Idle, Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
getVideoWidth | {Idle, Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
isPlaying | {Idle, Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
pause | {Started, Paused, PlaybackCompleted} | {Idle, Initialized, Prepared, Stopped, Error} | Successful invocation of this method in a valid state transfers the object to the Paused state. Calling this method in an invalid state transfers the object to the Error state.
prepare | {Initialized, Stopped} | {Idle, Prepared, Started, Paused, PlaybackCompleted, Error} | Successful invocation of this method in a valid state transfers the object to the Prepared state. Calling this method in an invalid state throws an IllegalStateException.
prepareAsync | {Initialized, Stopped} | {Idle, Prepared, Started, Paused, PlaybackCompleted, Error} | Successful invocation of this method in a valid state transfers the object to the Preparing state. Calling this method in an invalid state throws an IllegalStateException.
release | any | {} | After {@link #release()}, the object is no longer available.
reset | {Idle, Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted, Error} | {} | After {@link #reset()}, the object is like being just created.
seekTo | {Prepared, Started, Paused, PlaybackCompleted} | {Idle, Initialized, Stopped, Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
setAudioAttributes | {Idle, Initialized, Stopped, Prepared, Started, Paused, PlaybackCompleted} | {Error} | Successful invocation of this method does not change the state. In order for the target audio attributes type to become effective, this method must be called before prepare() or prepareAsync().
setAudioSessionId | {Idle} | {Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted, Error} | This method must be called in the Idle state as the audio session ID must be known before calling setDataSource. Calling it does not change the object state.
setAudioStreamType | {Idle, Initialized, Stopped, Prepared, Started, Paused, PlaybackCompleted} | {Error} | Successful invocation of this method does not change the state. In order for the target audio stream type to become effective, this method must be called before prepare() or prepareAsync().
setAuxEffectSendLevel | any | {} | Calling this method does not change the object state.
setDataSource | {Idle} | {Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted, Error} | Successful invocation of this method in a valid state transfers the object to the Initialized state. Calling this method in an invalid state throws an IllegalStateException.
setDisplay | any | {} | This method can be called in any state and calling it does not change the object state.
setSurface | any | {} | This method can be called in any state and calling it does not change the object state.
setVideoScalingMode | {Initialized, Prepared, Started, Paused, Stopped, PlaybackCompleted} | {Idle, Error} | Successful invocation of this method does not change the state.
setLooping | {Idle, Initialized, Stopped, Prepared, Started, Paused, PlaybackCompleted} | {Error} | Successful invocation of this method in a valid state does not change the state. Calling this method in an invalid state transfers the object to the Error state.
isLooping | any | {} | This method can be called in any state and calling it does not change the object state.
setOnBufferingUpdateListener | any | {} | This method can be called in any state and calling it does not change the object state.
setOnCompletionListener | any | {} | This method can be called in any state and calling it does not change the object state.
setOnErrorListener | any | {} | This method can be called in any state and calling it does not change the object state.
setOnPreparedListener | any | {} | This method can be called in any state and calling it does not change the object state.
setOnSeekCompleteListener | any | {} | This method can be called in any state and calling it does not change the object state.
setScreenOnWhilePlaying | any | {} | This method can be called in any state and calling it does not change the object state.
setVolume | {Idle, Initialized, Stopped, Prepared, Started, Paused, PlaybackCompleted} | {Error} | Successful invocation of this method does not change the state.
setWakeMode | any | {} | This method can be called in any state and calling it does not change the object state.
start | {Prepared, Started, Paused, PlaybackCompleted} | {Idle, Initialized, Stopped, Error} | Successful invocation of this method in a valid state transfers the object to the Started state. Calling this method in an invalid state transfers the object to the Error state.
stop | {Prepared, Started, Stopped, Paused, PlaybackCompleted} | {Idle, Initialized, Error} | Successful invocation of this method in a valid state transfers the object to the Stopped state. Calling this method in an invalid state transfers the object to the Error state.
getTrackInfo | {Prepared, Started, Stopped, Paused, PlaybackCompleted} | {Idle, Initialized, Error} | Successful invocation of this method does not change the state.
addTimedTextSource | {Prepared, Started, Stopped, Paused, PlaybackCompleted} | {Idle, Initialized, Error} | Successful invocation of this method does not change the state.
selectTrack | {Prepared, Started, Stopped, Paused, PlaybackCompleted} | {Idle, Initialized, Error} | Successful invocation of this method does not change the state.
deselectTrack | {Prepared, Started, Stopped, Paused, PlaybackCompleted} | {Idle, Initialized, Error} | Successful invocation of this method does not change the state.
Permissions
One may need to declare a corresponding WAKE_LOCK permission {@link
android.R.styleable#AndroidManifestUsesPermission <uses-permission>}
element.
This class requires the {@link android.Manifest.permission#INTERNET} permission
when used with network-based content.
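As a brief illustrative sketch (the mediaPlayer and context variables are assumed to be in scope), a player that must keep the CPU running during playback would declare the WAKE_LOCK permission in its manifest and then request the wake mode before starting playback:
// Requires <uses-permission android:name="android.permission.WAKE_LOCK" /> in the manifest.
// PARTIAL_WAKE_LOCK keeps the CPU running so audio continues with the screen off.
mediaPlayer.setWakeMode(context, PowerManager.PARTIAL_WAKE_LOCK);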
Callbacks
Applications may want to register for informational and error
events in order to be informed of some internal state update and
possible runtime errors during playback or streaming. Registration for
these events is done by properly setting the appropriate listeners (via calls
to
{@link #setOnPreparedListener(OnPreparedListener)},
{@link #setOnVideoSizeChangedListener(OnVideoSizeChangedListener)},
{@link #setOnSeekCompleteListener(OnSeekCompleteListener)},
{@link #setOnCompletionListener(OnCompletionListener)},
{@link #setOnBufferingUpdateListener(OnBufferingUpdateListener)},
{@link #setOnInfoListener(OnInfoListener)},
{@link #setOnErrorListener(OnErrorListener)}, etc).
In order to receive the respective callback
associated with these listeners, applications are required to create
MediaPlayer objects on a thread with its own Looper running (the main UI
thread by default has a Looper running).
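For example, the sketch below creates the MediaPlayer on a background HandlerThread so that its event callbacks are delivered on that thread's Looper rather than on the main thread. The thread name is illustrative only.
HandlerThread playerThread = new HandlerThread("PlayerCallbacks");
playerThread.start();
new Handler(playerThread.getLooper()).post(new Runnable() {
    @Override
    public void run() {
        // Constructed on a thread with a running Looper, so callbacks arrive here.
        MediaPlayer player = new MediaPlayer();
        player.setOnInfoListener(new MediaPlayer.OnInfoListener() {
            @Override
            public boolean onInfo(MediaPlayer mp, int what, int extra) {
                // e.g. toggle a buffering indicator on MEDIA_INFO_BUFFERING_START / _END
                return what == MediaPlayer.MEDIA_INFO_BUFFERING_START
                        || what == MediaPlayer.MEDIA_INFO_BUFFERING_END;
            }
        });
        // ... setDataSource(), prepareAsync(), and so on.
    }
});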
Fields Summary
Type | Field | Description
---|---|---
public static final boolean | METADATA_UPDATE_ONLY | Constant to retrieve only the new metadata since the last call. // FIXME: unhide; add link to getMetadata(boolean, boolean). {@hide}
public static final boolean | METADATA_ALL | Constant to retrieve all the metadata. // FIXME: unhide; add link to getMetadata(boolean, boolean). {@hide}
public static final boolean | APPLY_METADATA_FILTER | Constant to enable the metadata filter during retrieval. // FIXME: unhide; add link to getMetadata(boolean, boolean). {@hide}
public static final boolean | BYPASS_METADATA_FILTER | Constant to disable the metadata filter during retrieval. // FIXME: unhide; add link to getMetadata(boolean, boolean). {@hide}
private static final String | TAG |
private static final String | IMEDIA_PLAYER |
private long | mNativeContext |
private long | mNativeSurfaceTexture |
private int | mListenerContext |
private android.view.SurfaceHolder | mSurfaceHolder |
private EventHandler | mEventHandler |
private PowerManager.WakeLock | mWakeLock |
private boolean | mScreenOnWhilePlaying |
private boolean | mStayAwake |
private final com.android.internal.app.IAppOpsService | mAppOps |
private int | mStreamType |
private int | mUsage |
private static final int | INVOKE_ID_GET_TRACK_INFO |
private static final int | INVOKE_ID_ADD_EXTERNAL_SOURCE |
private static final int | INVOKE_ID_ADD_EXTERNAL_SOURCE_FD |
private static final int | INVOKE_ID_SELECT_TRACK |
private static final int | INVOKE_ID_DESELECT_TRACK |
private static final int | INVOKE_ID_SET_VIDEO_SCALE_MODE |
private static final int | INVOKE_ID_GET_SELECTED_TRACK |
public static final int | VIDEO_SCALING_MODE_SCALE_TO_FIT | Specifies a video scaling mode. The content is stretched to the surface rendering area. When the surface has the same aspect ratio as the content, the aspect ratio of the content is maintained; otherwise, the aspect ratio of the content is not maintained when the video is rendered. Unlike VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING, there is no content cropping with this video scaling mode.
public static final int | VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING | Specifies a video scaling mode. The content is scaled, maintaining its aspect ratio. The whole surface area is always used. When the aspect ratio of the content is the same as the surface, no content is cropped; otherwise, content is cropped to fit the surface.
private static final int | KEY_PARAMETER_AUDIO_ATTRIBUTES |
public static final String | MEDIA_MIMETYPE_TEXT_SUBRIP | MIME type for SubRip (SRT) container. Used in addTimedTextSource APIs.
public static final String | MEDIA_MIMETYPE_TEXT_VTT | MIME type for WebVTT subtitle data.
public static final String | MEDIA_MIMETYPE_TEXT_CEA_608 | MIME type for CEA-608 closed caption data.
private android.media.SubtitleController | mSubtitleController |
private final Object | mInbandSubtitleLock |
private SubtitleTrack[] | mInbandSubtitleTracks |
private int | mSelectedSubtitleTrackIndex |
private Vector | mOutOfBandSubtitleTracks |
private Vector | mOpenSubtitleSources |
private OnSubtitleDataListener | mSubtitleDataListener |
private static final int | MEDIA_NOP |
private static final int | MEDIA_PREPARED |
private static final int | MEDIA_PLAYBACK_COMPLETE |
private static final int | MEDIA_BUFFERING_UPDATE |
private static final int | MEDIA_SEEK_COMPLETE |
private static final int | MEDIA_SET_VIDEO_SIZE |
private static final int | MEDIA_STARTED |
private static final int | MEDIA_PAUSED |
private static final int | MEDIA_STOPPED |
private static final int | MEDIA_SKIPPED |
private static final int | MEDIA_TIMED_TEXT |
private static final int | MEDIA_ERROR |
private static final int | MEDIA_INFO |
private static final int | MEDIA_SUBTITLE_DATA |
private TimeProvider | mTimeProvider |
private OnPreparedListener | mOnPreparedListener |
private OnCompletionListener | mOnCompletionListener |
private OnBufferingUpdateListener | mOnBufferingUpdateListener |
private OnSeekCompleteListener | mOnSeekCompleteListener |
private OnVideoSizeChangedListener | mOnVideoSizeChangedListener |
private OnTimedTextListener | mOnTimedTextListener |
private OnSubtitleDataListener | mOnSubtitleDataListener |
public static final int | MEDIA_ERROR_UNKNOWN | Unspecified media player error.
public static final int | MEDIA_ERROR_SERVER_DIED | Media server died. In this case, the application must release the MediaPlayer object and instantiate a new one.
public static final int | MEDIA_ERROR_NOT_VALID_FOR_PROGRESSIVE_PLAYBACK | The video is streamed and its container is not valid for progressive playback, i.e. the video's index (e.g. moov atom) is not at the start of the file.
public static final int | MEDIA_ERROR_IO | File or network related operation errors.
public static final int | MEDIA_ERROR_MALFORMED | Bitstream is not conforming to the related coding standard or file spec.
public static final int | MEDIA_ERROR_UNSUPPORTED | Bitstream is conforming to the related coding standard or file spec, but the media framework does not support the feature.
public static final int | MEDIA_ERROR_TIMED_OUT | Some operation takes too long to complete, usually more than 3-5 seconds.
private OnErrorListener | mOnErrorListener |
public static final int | MEDIA_INFO_UNKNOWN | Unspecified media player info.
public static final int | MEDIA_INFO_STARTED_AS_NEXT | The player was started because it was used as the next player for another player, which just completed playback.
public static final int | MEDIA_INFO_VIDEO_RENDERING_START | The player just pushed the very first video frame for rendering.
public static final int | MEDIA_INFO_VIDEO_TRACK_LAGGING | The video is too complex for the decoder: it can't decode frames fast enough. Possibly only the audio plays fine at this stage.
public static final int | MEDIA_INFO_BUFFERING_START | MediaPlayer is temporarily pausing playback internally in order to buffer more data.
public static final int | MEDIA_INFO_BUFFERING_END | MediaPlayer is resuming playback after filling buffers.
public static final int | MEDIA_INFO_BAD_INTERLEAVING | Bad interleaving means that a media has been improperly interleaved or not interleaved at all, e.g. has all the video samples first and then all the audio ones. Video is playing but a lot of disk seeks may be happening.
public static final int | MEDIA_INFO_NOT_SEEKABLE | The media cannot be seeked (e.g. live stream).
public static final int | MEDIA_INFO_METADATA_UPDATE | A new set of metadata is available.
public static final int | MEDIA_INFO_EXTERNAL_METADATA_UPDATE | A new set of external-only metadata is available. Used by the Java framework to avoid triggering track scanning.
public static final int | MEDIA_INFO_TIMED_TEXT_ERROR | Failed to handle timed text track properly.
public static final int | MEDIA_INFO_UNSUPPORTED_SUBTITLE | Subtitle track was not supported by the media framework.
public static final int | MEDIA_INFO_SUBTITLE_TIMED_OUT | Reading the subtitle track takes too long.
private OnInfoListener | mOnInfoListener |
Constructors Summary
---
public MediaPlayer()
Default constructor. Consider using one of the create() methods for
synchronously instantiating a MediaPlayer from a Uri or resource.
When done with the MediaPlayer, you should call {@link #release()}
to free the resources. If not released, too many MediaPlayer instances may
result in an exception.
Looper looper;
if ((looper = Looper.myLooper()) != null) {
mEventHandler = new EventHandler(this, looper);
} else if ((looper = Looper.getMainLooper()) != null) {
mEventHandler = new EventHandler(this, looper);
} else {
mEventHandler = null;
}
mTimeProvider = new TimeProvider(this);
mOutOfBandSubtitleTracks = new Vector<SubtitleTrack>();
mOpenSubtitleSources = new Vector<InputStream>();
mInbandSubtitleTracks = new SubtitleTrack[0];
IBinder b = ServiceManager.getService(Context.APP_OPS_SERVICE);
mAppOps = IAppOpsService.Stub.asInterface(b);
/* Native setup requires a weak reference to our object.
* It's easier to create it here than in C++.
*/
native_setup(new WeakReference<MediaPlayer>(this));
Methods Summary
---
private native int | _getAudioStreamType()
| private native void | _pause()
| private native void | _prepare()
| private native void | _release()
| private native void | _reset()
| private native void | _setAudioStreamType(int streamtype)
| private native void | _setAuxEffectSendLevel(float level)
| private native void | _setDataSource(java.io.FileDescriptor fd, long offset, long length)
| private native void | _setVideoSurface(android.view.Surface surface)
| private native void | _setVolume(float leftVolume, float rightVolume)
| private native void | _start()
| private native void | _stop()
| public void | addSubtitleSource(java.io.InputStream is, android.media.MediaFormat format)
final InputStream fIs = is;
final MediaFormat fFormat = format;
// Ensure all input streams are closed. It is also a handy
// way to implement timeouts in the future.
synchronized(mOpenSubtitleSources) {
mOpenSubtitleSources.add(is);
}
// process each subtitle in its own thread
final HandlerThread thread = new HandlerThread("SubtitleReadThread",
Process.THREAD_PRIORITY_BACKGROUND + Process.THREAD_PRIORITY_MORE_FAVORABLE);
thread.start();
Handler handler = new Handler(thread.getLooper());
handler.post(new Runnable() {
private int addTrack() {
if (fIs == null || mSubtitleController == null) {
return MEDIA_INFO_UNSUPPORTED_SUBTITLE;
}
SubtitleTrack track = mSubtitleController.addTrack(fFormat);
if (track == null) {
return MEDIA_INFO_UNSUPPORTED_SUBTITLE;
}
// TODO: do the conversion in the subtitle track
Scanner scanner = new Scanner(fIs, "UTF-8");
String contents = scanner.useDelimiter("\\A").next();
synchronized(mOpenSubtitleSources) {
mOpenSubtitleSources.remove(fIs);
}
scanner.close();
mOutOfBandSubtitleTracks.add(track);
track.onData(contents.getBytes(), true /* eos */, ~0 /* runID: keep forever */);
return MEDIA_INFO_EXTERNAL_METADATA_UPDATE;
}
public void run() {
int res = addTrack();
if (mEventHandler != null) {
Message m = mEventHandler.obtainMessage(MEDIA_INFO, res, 0, null);
mEventHandler.sendMessage(m);
}
thread.getLooper().quitSafely();
}
});
| public void | addTimedTextSource(java.lang.String path, java.lang.String mimeType)Adds an external timed text source file.
Currently supported format is SubRip with the file extension .srt, case insensitive.
Note that a single external timed text source may contain multiple tracks in it.
One can find the total number of available tracks using {@link #getTrackInfo()} to see what
additional tracks become available after this method call.
if (!availableMimeTypeForExternalSource(mimeType)) {
final String msg = "Illegal mimeType for timed text source: " + mimeType;
throw new IllegalArgumentException(msg);
}
File file = new File(path);
if (file.exists()) {
FileInputStream is = new FileInputStream(file);
FileDescriptor fd = is.getFD();
addTimedTextSource(fd, mimeType);
is.close();
} else {
// We do not support the case where the path is not a file.
throw new IOException(path);
}
| public void | addTimedTextSource(android.content.Context context, android.net.Uri uri, java.lang.String mimeType)Adds an external timed text source file (Uri).
Currently supported format is SubRip with the file extension .srt, case insensitive.
Note that a single external timed text source may contain multiple tracks in it.
One can find the total number of available tracks using {@link #getTrackInfo()} to see what
additional tracks become available after this method call.
String scheme = uri.getScheme();
if(scheme == null || scheme.equals("file")) {
addTimedTextSource(uri.getPath(), mimeType);
return;
}
AssetFileDescriptor fd = null;
try {
ContentResolver resolver = context.getContentResolver();
fd = resolver.openAssetFileDescriptor(uri, "r");
if (fd == null) {
return;
}
addTimedTextSource(fd.getFileDescriptor(), mimeType);
return;
} catch (SecurityException ex) {
} catch (IOException ex) {
} finally {
if (fd != null) {
fd.close();
}
}
| public void | addTimedTextSource(java.io.FileDescriptor fd, java.lang.String mimeType)Adds an external timed text source file (FileDescriptor).
It is the caller's responsibility to close the file descriptor.
It is safe to do so as soon as this call returns.
Currently supported format is SubRip. Note that a single external timed text source may
contain multiple tracks in it. One can find the total number of available tracks
using {@link #getTrackInfo()} to see what additional tracks become available
after this method call.
// intentionally less than LONG_MAX
addTimedTextSource(fd, 0, 0x7ffffffffffffffL, mimeType);
| public void | addTimedTextSource(java.io.FileDescriptor fd, long offset, long length, java.lang.String mime)Adds an external timed text file (FileDescriptor).
It is the caller's responsibility to close the file descriptor.
It is safe to do so as soon as this call returns.
Currently supported format is SubRip. Note that a single external timed text source may
contain multiple tracks in it. One can find the total number of available tracks
using {@link #getTrackInfo()} to see what additional tracks become available
after this method call.
if (!availableMimeTypeForExternalSource(mime)) {
throw new IllegalArgumentException("Illegal mimeType for timed text source: " + mime);
}
FileDescriptor fd2;
try {
fd2 = Libcore.os.dup(fd);
} catch (ErrnoException ex) {
Log.e(TAG, ex.getMessage(), ex);
throw new RuntimeException(ex);
}
final MediaFormat fFormat = new MediaFormat();
fFormat.setString(MediaFormat.KEY_MIME, mime);
fFormat.setInteger(MediaFormat.KEY_IS_TIMED_TEXT, 1);
Context context = ActivityThread.currentApplication();
// A MediaPlayer created by a VideoView should already have its mSubtitleController set.
if (mSubtitleController == null) {
mSubtitleController = new SubtitleController(context, mTimeProvider, this);
mSubtitleController.setAnchor(new Anchor() {
@Override
public void setSubtitleWidget(RenderingWidget subtitleWidget) {
}
@Override
public Looper getSubtitleLooper() {
return Looper.getMainLooper();
}
});
}
if (!mSubtitleController.hasRendererFor(fFormat)) {
// test and add not atomic
mSubtitleController.registerRenderer(new SRTRenderer(context, mEventHandler));
}
final SubtitleTrack track = mSubtitleController.addTrack(fFormat);
mOutOfBandSubtitleTracks.add(track);
final FileDescriptor fd3 = fd2;
final long offset2 = offset;
final long length2 = length;
final HandlerThread thread = new HandlerThread(
"TimedTextReadThread",
Process.THREAD_PRIORITY_BACKGROUND + Process.THREAD_PRIORITY_MORE_FAVORABLE);
thread.start();
Handler handler = new Handler(thread.getLooper());
handler.post(new Runnable() {
private int addTrack() {
InputStream is = null;
final ByteArrayOutputStream bos = new ByteArrayOutputStream();
try {
Libcore.os.lseek(fd3, offset2, OsConstants.SEEK_SET);
byte[] buffer = new byte[4096];
for (long total = 0; total < length2;) {
int bytesToRead = (int) Math.min(buffer.length, length2 - total);
int bytes = IoBridge.read(fd3, buffer, 0, bytesToRead);
if (bytes < 0) {
break;
} else {
bos.write(buffer, 0, bytes);
total += bytes;
}
}
track.onData(bos.toByteArray(), true /* eos */, ~0 /* runID: keep forever */);
return MEDIA_INFO_EXTERNAL_METADATA_UPDATE;
} catch (Exception e) {
Log.e(TAG, e.getMessage(), e);
return MEDIA_INFO_TIMED_TEXT_ERROR;
} finally {
if (is != null) {
try {
is.close();
} catch (IOException e) {
Log.e(TAG, e.getMessage(), e);
}
}
}
}
public void run() {
int res = addTrack();
if (mEventHandler != null) {
Message m = mEventHandler.obtainMessage(MEDIA_INFO, res, 0, null);
mEventHandler.sendMessage(m);
}
thread.getLooper().quitSafely();
}
});
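As a usage sketch for the methods above (the subtitle path is a placeholder and player is assumed to be a prepared MediaPlayer), an external SubRip file can be added and then enabled by selecting the corresponding timed text track reported by getTrackInfo():
// addTimedTextSource() may throw IOException or IllegalArgumentException.
player.addTimedTextSource("/sdcard/subtitles.srt", MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
for (int i = 0; i < tracks.length; i++) {
    if (tracks[i].getTrackType() == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT) {
        player.selectTrack(i);      // timed text tracks are never selected automatically
        break;
    }
}
player.setOnTimedTextListener(new MediaPlayer.OnTimedTextListener() {
    @Override
    public void onTimedText(MediaPlayer mp, TimedText text) {
        // Render text.getText() in the UI as cues arrive.
    }
});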
| public native void | attachAuxEffect(int effectId)Attaches an auxiliary effect to the player. A typical auxiliary effect is a reverberation
effect which can be applied on any sound source that directs a certain amount of its
energy to this effect. This amount is defined by setAuxEffectSendLevel().
See {@link #setAuxEffectSendLevel(float)}.
After creating an auxiliary effect (e.g.
{@link android.media.audiofx.EnvironmentalReverb}), retrieve its ID with
{@link android.media.audiofx.AudioEffect#getId()} and use it when calling this method
to attach the player to the effect.
To detach the effect from the player, call this method with a null effect id.
This method must be called after one of the overloaded setDataSource
methods.
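A hedged usage sketch, assuming mediaPlayer already has its data source set: create a reverb on the output mix (audio session 0), attach the player to it, and route some of the player's signal to the effect.
EnvironmentalReverb reverb = new EnvironmentalReverb(0 /* priority */, 0 /* output mix session */);
reverb.setEnabled(true);
mediaPlayer.attachAuxEffect(reverb.getId());   // must be called after setDataSource()
mediaPlayer.setAuxEffectSendLevel(1.0f);       // send level defaults to 0, so the effect is silent otherwise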
| private static boolean | availableMimeTypeForExternalSource(java.lang.String mimeType)
/*
* A helper function to check if the mime type is supported by media framework.
*/
if (MEDIA_MIMETYPE_TEXT_SUBRIP.equals(mimeType)) {
return true;
}
return false;
| public static android.media.MediaPlayer | create(android.content.Context context, android.net.Uri uri, android.view.SurfaceHolder holder, AudioAttributes audioAttributes, int audioSessionId)Same factory method as {@link #create(Context, Uri, SurfaceHolder)} but that lets you specify
the audio attributes and session ID to be used by the new MediaPlayer instance.
try {
MediaPlayer mp = new MediaPlayer();
final AudioAttributes aa = audioAttributes != null ? audioAttributes :
new AudioAttributes.Builder().build();
mp.setAudioAttributes(aa);
mp.setAudioSessionId(audioSessionId);
mp.setDataSource(context, uri);
if (holder != null) {
mp.setDisplay(holder);
}
mp.prepare();
return mp;
} catch (IOException ex) {
Log.d(TAG, "create failed:", ex);
// fall through
} catch (IllegalArgumentException ex) {
Log.d(TAG, "create failed:", ex);
// fall through
} catch (SecurityException ex) {
Log.d(TAG, "create failed:", ex);
// fall through
}
return null;
| public static android.media.MediaPlayer | create(android.content.Context context, int resid)Convenience method to create a MediaPlayer for a given resource id.
On success, {@link #prepare()} will already have been called and must not be called again.
When done with the MediaPlayer, you should call {@link #release()},
to free the resources. If not released, too many MediaPlayer instances will
result in an exception.
Note that since {@link #prepare()} is called automatically in this method,
you cannot change the audio stream type (see {@link #setAudioStreamType(int)}), audio
session ID (see {@link #setAudioSessionId(int)}) or audio attributes
(see {@link #setAudioAttributes(AudioAttributes)}) of the new MediaPlayer.
int s = AudioSystem.newAudioSessionId();
return create(context, resid, null, s > 0 ? s : 0);
| public static android.media.MediaPlayer | create(android.content.Context context, int resid, AudioAttributes audioAttributes, int audioSessionId)Same factory method as {@link #create(Context, int)} but that lets you specify the audio
attributes and session ID to be used by the new MediaPlayer instance.
try {
AssetFileDescriptor afd = context.getResources().openRawResourceFd(resid);
if (afd == null) return null;
MediaPlayer mp = new MediaPlayer();
final AudioAttributes aa = audioAttributes != null ? audioAttributes :
new AudioAttributes.Builder().build();
mp.setAudioAttributes(aa);
mp.setAudioSessionId(audioSessionId);
mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
afd.close();
mp.prepare();
return mp;
} catch (IOException ex) {
Log.d(TAG, "create failed:", ex);
// fall through
} catch (IllegalArgumentException ex) {
Log.d(TAG, "create failed:", ex);
// fall through
} catch (SecurityException ex) {
Log.d(TAG, "create failed:", ex);
// fall through
}
return null;
| public static android.media.MediaPlayer | create(android.content.Context context, android.net.Uri uri)Convenience method to create a MediaPlayer for a given Uri.
On success, {@link #prepare()} will already have been called and must not be called again.
When done with the MediaPlayer, you should call {@link #release()},
to free the resources. If not released, too many MediaPlayer instances will
result in an exception.
Note that since {@link #prepare()} is called automatically in this method,
you cannot change the audio stream type (see {@link #setAudioStreamType(int)}), audio
session ID (see {@link #setAudioSessionId(int)}) or audio attributes
(see {@link #setAudioAttributes(AudioAttributes)}) of the new MediaPlayer.
return create (context, uri, null);
| public static android.media.MediaPlayer | create(android.content.Context context, android.net.Uri uri, android.view.SurfaceHolder holder)Convenience method to create a MediaPlayer for a given Uri.
On success, {@link #prepare()} will already have been called and must not be called again.
When done with the MediaPlayer, you should call {@link #release()},
to free the resources. If not released, too many MediaPlayer instances will
result in an exception.
Note that since {@link #prepare()} is called automatically in this method,
you cannot change the audio stream type (see {@link #setAudioStreamType(int)}), audio
session ID (see {@link #setAudioSessionId(int)}) or audio attributes
(see {@link #setAudioAttributes(AudioAttributes)}) of the new MediaPlayer.
int s = AudioSystem.newAudioSessionId();
return create(context, uri, holder, null, s > 0 ? s : 0);
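A short sketch of one-shot playback through the convenience factory; R.raw.notify is a placeholder resource id and context is assumed to be in scope.
MediaPlayer mp = MediaPlayer.create(context, R.raw.notify);   // already in the Prepared state, or null on failure
if (mp != null) {
    mp.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer player) {
            player.release();                                  // free codec resources promptly
        }
    });
    mp.start();
}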
| public void | deselectTrack(int index)Deselect a track.
Currently, the track must be a timed text track and no audio or video tracks can be
deselected. If the timed text track identified by index has not been
selected before, it throws an exception.
selectOrDeselectTrack(index, false /* select */);
| protected void | finalize()
native_finalize();
| public native int | getAudioSessionId()Returns the audio session ID.
| private int | getAudioStreamType()
if (mStreamType == AudioManager.USE_DEFAULT_STREAM_TYPE) {
mStreamType = _getAudioStreamType();
}
return mStreamType;
| public native int | getCurrentPosition()Gets the current playback position.
| public native int | getDuration()Gets the duration of the file.
| private android.media.MediaPlayer$TrackInfo[] | getInbandTrackInfo()
Parcel request = Parcel.obtain();
Parcel reply = Parcel.obtain();
try {
request.writeInterfaceToken(IMEDIA_PLAYER);
request.writeInt(INVOKE_ID_GET_TRACK_INFO);
invoke(request, reply);
TrackInfo trackInfo[] = reply.createTypedArray(TrackInfo.CREATOR);
return trackInfo;
} finally {
request.recycle();
reply.recycle();
}
| public android.media.MediaTimeProvider | getMediaTimeProvider()
if (mTimeProvider == null) {
mTimeProvider = new TimeProvider(this);
}
return mTimeProvider;
| public Metadata | getMetadata(boolean update_only, boolean apply_filter)Gets the media metadata.
Parcel reply = Parcel.obtain();
Metadata data = new Metadata();
if (!native_getMetadata(update_only, apply_filter, reply)) {
reply.recycle();
return null;
}
// Metadata takes over the parcel, don't recycle it unless
// there is an error.
if (!data.parse(reply)) {
reply.recycle();
return null;
}
return data;
| public int | getSelectedTrack(int trackType)Returns the index of the audio, video, or subtitle track currently selected for playback.
The return value is an index into the array returned by {@link #getTrackInfo()}, and can
be used in calls to {@link #selectTrack(int)} or {@link #deselectTrack(int)}.
if (trackType == TrackInfo.MEDIA_TRACK_TYPE_SUBTITLE && mSubtitleController != null) {
SubtitleTrack subtitleTrack = mSubtitleController.getSelectedTrack();
if (subtitleTrack != null) {
int index = mOutOfBandSubtitleTracks.indexOf(subtitleTrack);
if (index >= 0) {
return mInbandSubtitleTracks.length + index;
}
}
}
Parcel request = Parcel.obtain();
Parcel reply = Parcel.obtain();
try {
request.writeInterfaceToken(IMEDIA_PLAYER);
request.writeInt(INVOKE_ID_GET_SELECTED_TRACK);
request.writeInt(trackType);
invoke(request, reply);
int selectedTrack = reply.readInt();
return selectedTrack;
} finally {
request.recycle();
reply.recycle();
}
| public android.media.MediaPlayer$TrackInfo[] | getTrackInfo()Returns an array of track information.
TrackInfo trackInfo[] = getInbandTrackInfo();
// add out-of-band tracks
TrackInfo allTrackInfo[] = new TrackInfo[trackInfo.length + mOutOfBandSubtitleTracks.size()];
System.arraycopy(trackInfo, 0, allTrackInfo, 0, trackInfo.length);
int i = trackInfo.length;
for (SubtitleTrack track: mOutOfBandSubtitleTracks) {
int type = track.isTimedText()
? TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT
: TrackInfo.MEDIA_TRACK_TYPE_SUBTITLE;
allTrackInfo[i] = new TrackInfo(type, track.getFormat());
++i;
}
return allTrackInfo;
| public native int | getVideoHeight()Returns the height of the video.
| public native int | getVideoWidth()Returns the width of the video.
| public void | invoke(android.os.Parcel request, android.os.Parcel reply)Invoke a generic method on the native player using opaque
parcels for the request and reply. Both payloads' format is a
convention between the Java caller and the native player.
Must be called after setDataSource to make sure a native player
exists. On failure, a RuntimeException is thrown.
int retcode = native_invoke(request, reply);
reply.setDataPosition(0);
if (retcode != 0) {
throw new RuntimeException("failure code: " + retcode);
}
| public native boolean | isLooping()Checks whether the MediaPlayer is looping or non-looping.
| public native boolean | isPlaying()Checks whether the MediaPlayer is playing.
| private boolean | isRestricted()
try {
final int usage = mUsage != -1 ? mUsage
: AudioAttributes.usageForLegacyStreamType(getAudioStreamType());
final int mode = mAppOps.checkAudioOperation(AppOpsManager.OP_PLAY_AUDIO, usage,
Process.myUid(), ActivityThread.currentPackageName());
return mode != AppOpsManager.MODE_ALLOWED;
} catch (RemoteException e) {
return false;
}
| private boolean | isVideoScalingModeSupported(int mode)
return (mode == VIDEO_SCALING_MODE_SCALE_TO_FIT ||
mode == VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING);
| private native void | nativeSetDataSource(android.os.IBinder httpServiceBinder, java.lang.String path, java.lang.String[] keys, java.lang.String[] values)
| private final native void | native_finalize()
| private final native boolean | native_getMetadata(boolean update_only, boolean apply_filter, android.os.Parcel reply)
| private static final native void | native_init()
| private final native int | native_invoke(android.os.Parcel request, android.os.Parcel reply)
| public static native int | native_pullBatteryData(android.os.Parcel reply)
| private final native int | native_setMetadataFilter(android.os.Parcel request)
| private final native int | native_setRetransmitEndpoint(java.lang.String addrString, int port)
| private final native void | native_setup(java.lang.Object mediaplayer_this)
| public android.os.Parcel | newRequest()Create a request parcel which can be routed to the native media
player using {@link #invoke(Parcel, Parcel)}. The Parcel
returned has the proper InterfaceToken set. The caller should
not overwrite that token, i.e. it can only append data to the
Parcel.
Parcel parcel = Parcel.obtain();
parcel.writeInterfaceToken(IMEDIA_PLAYER);
return parcel;
| public void | onSubtitleTrackSelected(SubtitleTrack track)
if (mSelectedSubtitleTrackIndex >= 0) {
try {
selectOrDeselectInbandTrack(mSelectedSubtitleTrackIndex, false);
} catch (IllegalStateException e) {
}
mSelectedSubtitleTrackIndex = -1;
}
setOnSubtitleDataListener(null);
if (track == null) {
return;
}
for (int i = 0; i < mInbandSubtitleTracks.length; i++) {
if (mInbandSubtitleTracks[i] == track) {
Log.v(TAG, "Selecting subtitle track " + i);
mSelectedSubtitleTrackIndex = i;
try {
selectOrDeselectInbandTrack(mSelectedSubtitleTrackIndex, true);
} catch (IllegalStateException e) {
}
setOnSubtitleDataListener(mSubtitleDataListener);
break;
}
}
// no need to select out-of-band tracks
| public void | pause()Pauses playback. Call start() to resume.
stayAwake(false);
_pause();
| private static void | postEventFromNative(java.lang.Object mediaplayer_ref, int what, int arg1, int arg2, java.lang.Object obj)
MediaPlayer mp = (MediaPlayer)((WeakReference)mediaplayer_ref).get();
if (mp == null) {
return;
}
if (what == MEDIA_INFO && arg1 == MEDIA_INFO_STARTED_AS_NEXT) {
// this acquires the wakelock if needed, and sets the client side state
mp.start();
}
if (mp.mEventHandler != null) {
Message m = mp.mEventHandler.obtainMessage(what, arg1, arg2, obj);
mp.mEventHandler.sendMessage(m);
}
| public void | prepare()Prepares the player for playback, synchronously.
After setting the datasource and the display surface, you need to either
call prepare() or prepareAsync(). For files, it is OK to call prepare(),
which blocks until MediaPlayer is ready for playback.
_prepare();
scanInternalSubtitleTracks();
| public native void | prepareAsync()Prepares the player for playback, asynchronously.
After setting the datasource and the display surface, you need to either
call prepare() or prepareAsync(). For streams, you should call prepareAsync(),
which returns immediately, rather than blocking until enough data has been
buffered.
| public void | release()Releases resources associated with this MediaPlayer object.
It is considered good practice to call this method when you're
done using the MediaPlayer. In particular, whenever an Activity
of an application is paused (its onPause() method is called),
or stopped (its onStop() method is called), this method should be
invoked to release the MediaPlayer object, unless the application
has a special need to keep the object around. In addition to
unnecessary resources (such as memory and instances of codecs)
being held, failure to call this method immediately if a
MediaPlayer object is no longer needed may also lead to
continuous battery consumption for mobile devices, and playback
failure for other applications if multiple instances of the
same codec are not supported on a device. Even if multiple instances
of the same codec are supported, some performance degradation
may be expected when unnecessary multiple instances are used
at the same time.
stayAwake(false);
updateSurfaceScreenOn();
mOnPreparedListener = null;
mOnBufferingUpdateListener = null;
mOnCompletionListener = null;
mOnSeekCompleteListener = null;
mOnErrorListener = null;
mOnInfoListener = null;
mOnVideoSizeChangedListener = null;
mOnTimedTextListener = null;
if (mTimeProvider != null) {
mTimeProvider.close();
mTimeProvider = null;
}
mOnSubtitleDataListener = null;
_release();
| public void | reset()Resets the MediaPlayer to its uninitialized state. After calling
this method, you will have to initialize it again by setting the
data source and calling prepare().
mSelectedSubtitleTrackIndex = -1;
synchronized(mOpenSubtitleSources) {
for (final InputStream is: mOpenSubtitleSources) {
try {
is.close();
} catch (IOException e) {
}
}
mOpenSubtitleSources.clear();
}
mOutOfBandSubtitleTracks.clear();
mInbandSubtitleTracks = new SubtitleTrack[0];
if (mSubtitleController != null) {
mSubtitleController.reset();
}
if (mTimeProvider != null) {
mTimeProvider.close();
mTimeProvider = null;
}
stayAwake(false);
_reset();
// make sure none of the listeners get called anymore
if (mEventHandler != null) {
mEventHandler.removeCallbacksAndMessages(null);
}
| private void | scanInternalSubtitleTracks()
if (mSubtitleController == null) {
Log.e(TAG, "Should have subtitle controller already set");
return;
}
TrackInfo[] tracks = getInbandTrackInfo();
synchronized (mInbandSubtitleLock) {
SubtitleTrack[] inbandTracks = new SubtitleTrack[tracks.length];
for (int i=0; i < tracks.length; i++) {
if (tracks[i].getTrackType() == TrackInfo.MEDIA_TRACK_TYPE_SUBTITLE) {
if (i < mInbandSubtitleTracks.length) {
inbandTracks[i] = mInbandSubtitleTracks[i];
} else {
SubtitleTrack track = mSubtitleController.addTrack(
tracks[i].getFormat());
inbandTracks[i] = track;
}
}
}
mInbandSubtitleTracks = inbandTracks;
}
mSubtitleController.selectDefaultTrack();
| public native void | seekTo(int msec)Seeks to specified time position.
| private void | selectOrDeselectInbandTrack(int index, boolean select)
Parcel request = Parcel.obtain();
Parcel reply = Parcel.obtain();
try {
request.writeInterfaceToken(IMEDIA_PLAYER);
request.writeInt(select? INVOKE_ID_SELECT_TRACK: INVOKE_ID_DESELECT_TRACK);
request.writeInt(index);
invoke(request, reply);
} finally {
request.recycle();
reply.recycle();
}
| private void | selectOrDeselectTrack(int index, boolean select)
// handle subtitle track through subtitle controller
SubtitleTrack track = null;
synchronized (mInbandSubtitleLock) {
if (mInbandSubtitleTracks.length == 0) {
TrackInfo[] tracks = getInbandTrackInfo();
mInbandSubtitleTracks = new SubtitleTrack[tracks.length];
for (int i=0; i < tracks.length; i++) {
if (tracks[i].getTrackType() == TrackInfo.MEDIA_TRACK_TYPE_SUBTITLE) {
mInbandSubtitleTracks[i] = mSubtitleController.addTrack(tracks[i].getFormat());
}
}
}
}
if (index < mInbandSubtitleTracks.length) {
track = mInbandSubtitleTracks[index];
} else if (index < mInbandSubtitleTracks.length + mOutOfBandSubtitleTracks.size()) {
track = mOutOfBandSubtitleTracks.get(index - mInbandSubtitleTracks.length);
}
if (mSubtitleController != null && track != null) {
if (select) {
if (track.isTimedText()) {
int ttIndex = getSelectedTrack(TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT);
if (ttIndex >= 0 && ttIndex < mInbandSubtitleTracks.length) {
// deselect inband counterpart
selectOrDeselectInbandTrack(ttIndex, false);
}
}
mSubtitleController.selectTrack(track);
} else if (mSubtitleController.getSelectedTrack() == track) {
mSubtitleController.selectTrack(null);
} else {
Log.w(TAG, "trying to deselect track that was not selected");
}
return;
}
selectOrDeselectInbandTrack(index, select);
| public void | selectTrack(int index)Selects a track.
If a MediaPlayer is in an invalid state, it throws an IllegalStateException.
If a MediaPlayer is in the Started state, the selected track is presented immediately.
If a MediaPlayer is not in the Started state, it just marks the track to be played.
In any valid state, if it is called multiple times on the same type of track (i.e. Video,
Audio, Timed Text), the most recent one will be chosen.
The first audio and video tracks are selected by default if available, even if
this method is not called. However, no timed text track will be selected until
this function is called.
Currently, only timed text tracks or audio tracks can be selected via this method.
In addition, the support for selecting an audio track at runtime is pretty limited
in that an audio track can only be selected in the Prepared state.
selectOrDeselectTrack(index, true /* select */);
| public void | setAudioAttributes(AudioAttributes attributes)Sets the audio attributes for this MediaPlayer.
See {@link AudioAttributes} for how to build and configure an instance of this class.
You must call this method before {@link #prepare()} or {@link #prepareAsync()} in order
for the audio attributes to become effective thereafter.
if (attributes == null) {
final String msg = "Cannot set AudioAttributes to null";
throw new IllegalArgumentException(msg);
}
mUsage = attributes.getUsage();
Parcel pattributes = Parcel.obtain();
attributes.writeToParcel(pattributes, AudioAttributes.FLATTEN_TAGS);
setParameter(KEY_PARAMETER_AUDIO_ATTRIBUTES, pattributes);
pattributes.recycle();
| public native void | setAudioSessionId(int sessionId)Sets the audio session ID.
| public void | setAudioStreamType(int streamtype)Sets the audio stream type for this MediaPlayer. See {@link AudioManager}
for a list of stream types. Must call this method before prepare() or
prepareAsync() in order for the target stream type to become effective
thereafter.
_setAudioStreamType(streamtype);
mStreamType = streamtype;
| public void | setAuxEffectSendLevel(float level)Sets the send level of the player to the attached auxiliary effect.
See {@link #attachAuxEffect(int)}. The level value range is 0 to 1.0.
By default the send level is 0, so even if an effect is attached to the player
this method must be called for the effect to be applied.
Note that the passed level value is a raw scalar. UI controls should be scaled
logarithmically: the gain applied by audio framework ranges from -72dB to 0dB,
so an appropriate conversion from linear UI input x to level is:
x == 0 -> level = 0
0 < x <= R -> level = 10^(72*(x-R)/20/R)
if (isRestricted()) {
return;
}
_setAuxEffectSendLevel(level);
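A small helper implementing the conversion described above; the method name and the variable R are illustrative, not part of this API. x is a linear UI value in the range [0, R].
static float uiPositionToSendLevel(float x, float R) {
    if (x <= 0) {
        return 0f;                                             // x == 0 -> level = 0
    }
    return (float) Math.pow(10.0, 72.0 * (x - R) / 20.0 / R); // 0 < x <= R -> level = 10^(72*(x-R)/20/R)
}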
| public void | setDataSource(android.content.Context context, android.net.Uri uri)Sets the data source as a content Uri.
setDataSource(context, uri, null);
| public void | setDataSource(android.content.Context context, android.net.Uri uri, java.util.Map headers)Sets the data source as a content Uri.
final String scheme = uri.getScheme();
if (ContentResolver.SCHEME_FILE.equals(scheme)) {
setDataSource(uri.getPath());
return;
} else if (ContentResolver.SCHEME_CONTENT.equals(scheme)
&& Settings.AUTHORITY.equals(uri.getAuthority())) {
// Redirect ringtones to go directly to underlying provider
uri = RingtoneManager.getActualDefaultRingtoneUri(context,
RingtoneManager.getDefaultType(uri));
if (uri == null) {
throw new FileNotFoundException("Failed to resolve default ringtone");
}
}
AssetFileDescriptor fd = null;
try {
ContentResolver resolver = context.getContentResolver();
fd = resolver.openAssetFileDescriptor(uri, "r");
if (fd == null) {
return;
}
// Note: using getDeclaredLength so that our behavior is the same
// as previous versions when the content provider is returning
// a full file.
if (fd.getDeclaredLength() < 0) {
setDataSource(fd.getFileDescriptor());
} else {
setDataSource(fd.getFileDescriptor(), fd.getStartOffset(), fd.getDeclaredLength());
}
return;
} catch (SecurityException ex) {
} catch (IOException ex) {
} finally {
if (fd != null) {
fd.close();
}
}
Log.d(TAG, "Couldn't open file on client side, trying server side");
setDataSource(uri.toString(), headers);
| public void | setDataSource(java.lang.String path)Sets the data source (file-path or http/rtsp URL) to use.
setDataSource(path, null, null);
| public void | setDataSource(java.lang.String path, java.util.Map headers)Sets the data source (file-path or http/rtsp URL) to use.
String[] keys = null;
String[] values = null;
if (headers != null) {
keys = new String[headers.size()];
values = new String[headers.size()];
int i = 0;
for (Map.Entry<String, String> entry: headers.entrySet()) {
keys[i] = entry.getKey();
values[i] = entry.getValue();
++i;
}
}
setDataSource(path, keys, values);
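A sketch of supplying HTTP request headers along with the data source; the URL and header value are placeholders:
// Requires java.util.Map and java.util.HashMap.
Map<String, String> headers = new HashMap<String, String>();
headers.put("User-Agent", "MyPlayer/1.0");                    // placeholder value
mp.setDataSource("https://example.com/stream.mp3", headers);  // placeholder URL
mp.prepareAsync();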
| private void | setDataSource(java.lang.String path, java.lang.String[] keys, java.lang.String[] values)
final Uri uri = Uri.parse(path);
final String scheme = uri.getScheme();
if ("file".equals(scheme)) {
path = uri.getPath();
} else if (scheme != null) {
// handle non-file sources
nativeSetDataSource(
MediaHTTPService.createHttpServiceBinderIfNecessary(path),
path,
keys,
values);
return;
}
final File file = new File(path);
if (file.exists()) {
FileInputStream is = new FileInputStream(file);
FileDescriptor fd = is.getFD();
setDataSource(fd);
is.close();
} else {
throw new IOException("setDataSource failed.");
}
| public void | setDataSource(java.io.FileDescriptor fd)Sets the data source (FileDescriptor) to use. It is the caller's responsibility
to close the file descriptor. It is safe to do so as soon as this call returns.
// intentionally less than LONG_MAX
setDataSource(fd, 0, 0x7ffffffffffffffL);
| public void | setDataSource(java.io.FileDescriptor fd, long offset, long length)Sets the data source (FileDescriptor) to use. The FileDescriptor must be
seekable (N.B. a LocalSocket is not seekable). It is the caller's responsibility
to close the file descriptor. It is safe to do so as soon as this call returns.
_setDataSource(fd, offset, length);
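For example, to play a clip bundled in the app's assets (the asset name is hypothetical):
AssetFileDescriptor afd = context.getAssets().openFd("clips/intro.mp3");
try {
    mp.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
} finally {
    afd.close();   // safe to close as soon as setDataSource() returns
}
mp.prepare();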
| public void | setDisplay(android.view.SurfaceHolder sh)Sets the {@link SurfaceHolder} to use for displaying the video
portion of the media.
Either a surface holder or surface must be set if a display or video sink
is needed. Not calling this method or {@link #setSurface(Surface)}
when playing back a video will result in only the audio track being played.
A null surface holder or surface will result in only the audio track being
played.
mSurfaceHolder = sh;
Surface surface;
if (sh != null) {
surface = sh.getSurface();
} else {
surface = null;
}
_setVideoSurface(surface);
updateSurfaceScreenOn();
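A common pattern is to hand the player the holder of a SurfaceView once its surface exists; surfaceView and mp are assumed to be fields of the hosting Activity:
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override public void surfaceCreated(SurfaceHolder holder) {
        mp.setDisplay(holder);
        mp.setScreenOnWhilePlaying(true);   // keep the screen on via this holder
    }
    @Override public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { }
    @Override public void surfaceDestroyed(SurfaceHolder holder) {
        mp.setDisplay(null);                // surface gone; only audio continues
    }
});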
| public native void | setLooping(boolean looping)Sets the player to be looping or non-looping.
| public int | setMetadataFilter(java.util.Set allow, java.util.Set block)Set a filter for the metadata update notification and update
retrieval. The caller provides two sets of metadata keys, allowed
and blocked. The blocked set always takes precedence over the
allowed one.
Metadata.MATCH_ALL and Metadata.MATCH_NONE are two predefined sets
available as shorthands to allow or block all metadata, or none.
By default, there is no filter set.
// Do our serialization manually instead of calling
// Parcel.writeArray: since the sets are made of the same type,
// we avoid paying the price of calling writeValue (used by
// writeArray), which burns an extra int per element to encode
// the type.
Parcel request = newRequest();
// The parcel already starts with an interface token. There
// are 2 filters. Each one starts with a 4-byte number storing
// the length, followed by that many ints (4 bytes each)
// representing the metadata types.
int capacity = request.dataSize() + 4 * (1 + allow.size() + 1 + block.size());
if (request.dataCapacity() < capacity) {
request.setDataCapacity(capacity);
}
request.writeInt(allow.size());
for(Integer t: allow) {
request.writeInt(t);
}
request.writeInt(block.size());
for(Integer t: block) {
request.writeInt(t);
}
return native_setMetadataFilter(request);
| public native void | setNextMediaPlayer(android.media.MediaPlayer next)Set the MediaPlayer to start when this MediaPlayer finishes playback
(i.e. reaches the end of the stream).
The media framework will attempt to transition from this player to
the next as seamlessly as possible. The next player can be set at
any time before completion. The next player must be prepared by the
app, and the application should not call start() on it.
The next MediaPlayer must be different from 'this'. An exception
will be thrown if next == this.
The application may call setNextMediaPlayer(null) to indicate no
next player should be started at the end of playback.
If the current player is looping, it will keep looping and the next
player will not be started.
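A sketch of chaining two players for near-gapless playback; the raw resources are hypothetical:
MediaPlayer first = MediaPlayer.create(context, R.raw.track1);    // returned in the Prepared state
MediaPlayer second = MediaPlayer.create(context, R.raw.track2);   // prepared, but not started
first.setNextMediaPlayer(second);
first.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    @Override public void onCompletion(MediaPlayer mp) {
        mp.release();   // free the finished player; 'second' is now playing
    }
});
first.start();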
| public void | setOnBufferingUpdateListener(android.media.MediaPlayer$OnBufferingUpdateListener listener)Register a callback to be invoked when the status of a network
stream's buffer has changed.
mOnBufferingUpdateListener = listener;
| public void | setOnCompletionListener(android.media.MediaPlayer$OnCompletionListener listener)Register a callback to be invoked when the end of a media source
has been reached during playback.
mOnCompletionListener = listener;
| public void | setOnErrorListener(android.media.MediaPlayer$OnErrorListener listener)Register a callback to be invoked when an error has happened
during an asynchronous operation.
mOnErrorListener = listener;
| public void | setOnInfoListener(android.media.MediaPlayer$OnInfoListener listener)Register a callback to be invoked when an info/warning is available.
mOnInfoListener = listener;
| public void | setOnPreparedListener(android.media.MediaPlayer$OnPreparedListener listener)Register a callback to be invoked when the media source is ready
for playback.
mOnPreparedListener = listener;
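The usual pairing with {@link #prepareAsync()} looks like the following; the stream URL is a placeholder:
mp.setDataSource("https://example.com/stream.mp3");   // placeholder URL
mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override public void onPrepared(MediaPlayer player) {
        player.start();    // start only once the engine reports Prepared
    }
});
mp.prepareAsync();         // returns immediately; onPrepared() fires later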
| public void | setOnSeekCompleteListener(android.media.MediaPlayer$OnSeekCompleteListener listener)Register a callback to be invoked when a seek operation has been
completed.
mOnSeekCompleteListener = listener;
| public void | setOnSubtitleDataListener(android.media.MediaPlayer$OnSubtitleDataListener listener)Register a callback to be invoked when a track has data available.
mOnSubtitleDataListener = listener;
| public void | setOnTimedTextListener(android.media.MediaPlayer$OnTimedTextListener listener)Register a callback to be invoked when a timed text is available
for display.
mOnTimedTextListener = listener;
| public void | setOnVideoSizeChangedListener(android.media.MediaPlayer$OnVideoSizeChangedListener listener)Register a callback to be invoked when the video size is
known or updated.
mOnVideoSizeChangedListener = listener;
| private native boolean | setParameter(int key, android.os.Parcel value)Sets the parameter indicated by key.
| public void | setRetransmitEndpoint(java.net.InetSocketAddress endpoint)Sets the target UDP re-transmit endpoint for the low level player.
Generally, the address portion of the endpoint is an IP multicast
address, although a unicast address would be equally valid. When a valid
retransmit endpoint has been set, the media player will not decode and
render the media presentation locally. Instead, the player will attempt
to re-multiplex its media data using the Android@Home RTP profile and
re-transmit to the target endpoint. Receiver devices (which may be
either the same as the transmitting device or different devices) may
instantiate, prepare, and start a receiver player using a setDataSource
URL of the form...
aahRX://<multicastIP>:<port>
to receive, decode and render the re-transmitted content.
setRetransmitEndpoint may only be called before setDataSource has been
called, i.e. while the player is in the Idle state.
String addrString = null;
int port = 0;
if (null != endpoint) {
addrString = endpoint.getAddress().getHostAddress();
port = endpoint.getPort();
}
int ret = native_setRetransmitEndpoint(addrString, port);
if (ret != 0) {
throw new IllegalArgumentException("Illegal re-transmit endpoint; native ret " + ret);
}
| public void | setScreenOnWhilePlaying(boolean screenOn)Control whether we should use the attached SurfaceHolder to keep the
screen on while video playback is occurring. This is the preferred
method over {@link #setWakeMode} where possible, since it doesn't
require that the application have permission for low-level wake lock
access.
if (mScreenOnWhilePlaying != screenOn) {
if (screenOn && mSurfaceHolder == null) {
Log.w(TAG, "setScreenOnWhilePlaying(true) is ineffective without a SurfaceHolder");
}
mScreenOnWhilePlaying = screenOn;
updateSurfaceScreenOn();
}
| public void | setSubtitleAnchor(android.media.SubtitleController controller, android.media.SubtitleController.Anchor anchor)
// TODO: create SubtitleController in MediaPlayer
mSubtitleController = controller;
mSubtitleController.setAnchor(anchor);
| public void | setSurface(android.view.Surface surface)Sets the {@link Surface} to be used as the sink for the video portion of
the media. This is similar to {@link #setDisplay(SurfaceHolder)}, but
does not support {@link #setScreenOnWhilePlaying(boolean)}. Setting a
Surface will un-set any Surface or SurfaceHolder that was previously set.
A null surface will result in only the audio track being played.
If the Surface sends frames to a {@link SurfaceTexture}, the timestamps
returned from {@link SurfaceTexture#getTimestamp()} will have an
unspecified zero point. These timestamps cannot be directly compared
between different media sources, different instances of the same media
source, or multiple runs of the same program. The timestamp is normally
monotonically increasing and is unaffected by time-of-day adjustments,
but it is reset when the position is set.
if (mScreenOnWhilePlaying && surface != null) {
Log.w(TAG, "setScreenOnWhilePlaying(true) is ineffective for Surface");
}
mSurfaceHolder = null;
_setVideoSurface(surface);
updateSurfaceScreenOn();
| public void | setVideoScalingMode(int mode)Sets the video scaling mode. To make the target video scaling mode
effective during playback, this method must be called after the
data source is set. If it is not called, the default video
scaling mode is {@link #VIDEO_SCALING_MODE_SCALE_TO_FIT}.
The supported video scaling modes are:
- {@link #VIDEO_SCALING_MODE_SCALE_TO_FIT}
- {@link #VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING}
if (!isVideoScalingModeSupported(mode)) {
final String msg = "Scaling mode " + mode + " is not supported";
throw new IllegalArgumentException(msg);
}
Parcel request = Parcel.obtain();
Parcel reply = Parcel.obtain();
try {
request.writeInterfaceToken(IMEDIA_PLAYER);
request.writeInt(INVOKE_ID_SET_VIDEO_SCALE_MODE);
request.writeInt(mode);
invoke(request, reply);
} finally {
request.recycle();
reply.recycle();
}
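For example, to crop and fill the view after the data source has been set (videoUri is assumed to exist):
mp.setDataSource(context, videoUri);
mp.setVideoScalingMode(MediaPlayer.VIDEO_SCALING_MODE_SCALE_TO_FIT_WITH_CROPPING);
mp.prepareAsync();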
| public void | setVolume(float leftVolume, float rightVolume)Sets the volume on this player.
This API is recommended for balancing the output of audio streams
within an application. Unless you are writing an application to
control user settings, this API should be used in preference to
{@link AudioManager#setStreamVolume(int, int, int)} which sets the volume of ALL streams of
a particular type. Note that the passed volume values are raw scalars in range 0.0 to 1.0.
UI controls should be scaled logarithmically.
if (isRestricted()) {
return;
}
_setVolume(leftVolume, rightVolume);
| public void | setVolume(float volume)Similar to {@link #setVolume(float, float)}, except it sets the volume of all channels to the same value.
setVolume(volume, volume);
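One possible logarithmic mapping from a linear slider position x in [0, 1] to a per-player volume; the -40dB floor is an arbitrary example choice, not a framework requirement:
float x = 0.5f;   // hypothetical slider position
float gain = (x == 0f) ? 0f : (float) Math.pow(10.0, (1.0 - x) * -40.0 / 20.0);
mp.setVolume(gain, gain);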
| public void | setWakeMode(android.content.Context context, int mode)Set the low-level power management behavior for this MediaPlayer. This
can be used when the MediaPlayer is not playing through a SurfaceHolder
set with {@link #setDisplay(SurfaceHolder)} and thus cannot use the
high-level {@link #setScreenOnWhilePlaying(boolean)} feature.
This method has the MediaPlayer access the low-level power manager
service to control the device's power usage while playback is occurring.
The parameter is a combination of {@link android.os.PowerManager} wake flags.
Use of this method requires {@link android.Manifest.permission#WAKE_LOCK}
permission.
By default, no attempt is made to keep the device awake during playback.
boolean washeld = false;
if (mWakeLock != null) {
if (mWakeLock.isHeld()) {
washeld = true;
mWakeLock.release();
}
mWakeLock = null;
}
PowerManager pm = (PowerManager)context.getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(mode|PowerManager.ON_AFTER_RELEASE, MediaPlayer.class.getName());
mWakeLock.setReferenceCounted(false);
if (washeld) {
mWakeLock.acquire();
}
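For audio-only playback that must survive the screen turning off, a typical call is the following; it assumes the manifest already declares android.permission.WAKE_LOCK:
mp.setWakeMode(context, PowerManager.PARTIAL_WAKE_LOCK);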
| public void | start()Starts or resumes playback. If playback had previously been paused,
playback will continue from where it was paused. If playback had
been stopped, or never started before, playback will start at the
beginning.
if (isRestricted()) {
_setVolume(0, 0);
}
stayAwake(true);
_start();
| private void | stayAwake(boolean awake)
if (mWakeLock != null) {
if (awake && !mWakeLock.isHeld()) {
mWakeLock.acquire();
} else if (!awake && mWakeLock.isHeld()) {
mWakeLock.release();
}
}
mStayAwake = awake;
updateSurfaceScreenOn();
| public void | stop()Stops playback after playback has been started or paused.
stayAwake(false);
_stop();
| private void | updateSurfaceScreenOn()
if (mSurfaceHolder != null) {
mSurfaceHolder.setKeepScreenOn(mScreenOnWhilePlaying && mStayAwake);
}