The multimedia support in MMAPI is deliberately open-ended. The general-purpose design of MMAPI means that the supported set of protocols and content formats is not mandated and is left for each Java ME implementation to define. Because there is no mandatory set, MMAPI also cannot specify which content types should work on top of which protocols.
In Java ME on Symbian OS, the supported MMAPI content types and protocols are derived from the native Symbian OS Multimedia Framework (MMF), because Java ME on Symbian OS integrates tightly with the native MMF. A clear benefit of this approach is that Java ME applications are consistent with the Symbian OS platform with regard to supported content and to playing and recording behavior.
In order to give a Java application a way to query the system about its multimedia capabilities, the MMAPI Manager class defines two methods that you can use:
Manager.getSupportedContentTypes(String protocol) returns the list of supported content types for a given protocol. For example, if the given protocol is http, the returned value is the set of content types that can be played back over the HTTP protocol. To get all the supported content types, you can pass null as the protocol.
Manager.getSupportedProtocols(String contentType) returns the list of supported protocols for a given content type; these identify the locators that can be used for creating MMAPI players. For example, if the given content type is video/mpeg, the returned value is the set of protocols that can be used to play back video/mpeg content. To get all the supported protocols, you can pass null as the content type.
The snippet of code below, which can also be added to
the Java ME Detectors MIDlet, detects the supported content types and
protocols and displays them:
private void detectSupport() {
    try {
        form.append("getSupportedContentTypes:");
        String[] supportedContentTypes =
            Manager.getSupportedContentTypes(null);
        for (int i = 0; i < supportedContentTypes.length; i++) {
            form.append(supportedContentTypes[i]);
        }
        form.append("getSupportedProtocols:");
        String[] supportedProtocols = Manager.getSupportedProtocols(null);
        for (int i = 0; i < supportedProtocols.length; i++) {
            form.append(supportedProtocols[i]);
        }
    }
    catch (Exception e) {
        form.append(e.getMessage());
    }
}
Another source of information is the MMAPI properties that can be queried by System.getProperty(String key). JSR-135 MMAPI defines the following keys:
supports.mixing
supports.audio.capture
supports.video.capture
supports.recording
audio.encodings
video.encodings
video.snapshot.encodings
streamable.contents
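As a minimal sketch, the helper below queries each of these JSR-135 keys with System.getProperty and builds a readable report (the class and method names are illustrative, not part of MMAPI). On a MIDP device the implementation supplies the values; on other Java runtimes these keys are simply undefined and System.getProperty returns null:

```java
// Queries the JSR-135 system properties and reports each value.
// On a MIDP device the implementation fills these in; elsewhere
// System.getProperty returns null for the MMAPI keys.
public class MmapiProperties {

    static final String[] KEYS = {
        "supports.mixing", "supports.audio.capture",
        "supports.video.capture", "supports.recording",
        "audio.encodings", "video.encodings",
        "video.snapshot.encodings", "streamable.contents"
    };

    // Builds one "key=value" line per property.
    public static String report() {
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < KEYS.length; i++) {
            String value = System.getProperty(KEYS[i]);
            sb.append(KEYS[i]).append('=')
              .append(value == null ? "(not defined)" : value)
              .append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(report());
    }
}
```

In a MIDlet you would typically append each line of the report to a Form, just as in the detectSupport() snippet above.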
Now that we know how to detect the supported content types and protocols, let's see how to use that information. The first decision to make is when to perform the detection – during development or dynamically at application run time.
If you detect the multimedia capabilities during
development, you can package the multimedia content with the JAR
according to the support on the target device. If you are targeting more
than one device, you should package content whose type and format are
supported by all the targeted devices. The advantage is that, once the
suite is installed, the content is available on the device and does not
need to be fetched. The disadvantages are that the content increases the
JAR size and cannot be dynamically changed. So if your suite needs to
be signed you cannot change the content later without going through the
signing process again.
If you detect the multimedia capabilities at run
time, you can package in your JAR different versions of the same content
to be used according to the detected support or you can fetch the
appropriate content from a remote server. When the MIDlet starts, it
detects the available support and can play the content immediately by
creating a Player with the appropriate locator or can fetch
content of the appropriate type from the server. For caching, the MIDlet
can use the RMS or JSR-75 FileConnection which is supported on all
current Symbian smartphones. Handling can be improved further by
delegating to the remote server the decision about which content to use
or fetch. The MIDlet sends the query results to the remote server
without processing them and the decision logic resides on the server.
Your application can detect the multimedia capabilities just once, the first time it is launched, or repeat the detection on subsequent runs.
The advantages of this approach are a
smaller JAR size and the dynamic nature of content support. The
disadvantage is the dependency on a remote server which must be
accessible to download the content (the dependency can be minimized if
the content is fetched just once at first application run).
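The selection step described above can be sketched with a small helper (a hypothetical class, not part of MMAPI): given the array the device reports, for example from Manager.getSupportedContentTypes(null), and a preference-ordered list of candidate types chosen by you or by the server, it returns the first candidate the device supports, or null when nothing matches:

```java
// Hypothetical helper, not part of MMAPI: picks the first content type
// from a preference-ordered candidate list that also appears in the
// device's reported support (e.g. Manager.getSupportedContentTypes(null)).
public class ContentSelector {

    // Returns the first preferred type that is supported, or null.
    public static String pick(String[] supported, String[] preferred) {
        for (int i = 0; i < preferred.length; i++) {
            for (int j = 0; j < supported.length; j++) {
                if (preferred[i].equals(supported[j])) {
                    return preferred[i];
                }
            }
        }
        return null;
    }

    public static void main(String[] args) {
        String[] deviceTypes = { "audio/midi", "audio/x-wav", "audio/amr" };
        String[] serverPrefs = { "audio/mpeg", "audio/x-wav" };
        // The device does not report audio/mpeg, so audio/x-wav is chosen.
        System.out.println(ContentSelector.pick(deviceTypes, serverPrefs));
    }
}
```

The chosen type can then be used to pick the matching resource in the JAR or to request the right variant from the remote server.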