J2ME Guide – Part 8

The number of devices on the market, and their features, grow continuously; every now and then a new mobile phone or PDA is launched. At first, a mobile phone was used just to make and receive voice calls. Now our device is a powerful object that allows us to listen to music, take pictures and videos and browse the web. These multimedia capabilities couldn’t be ignored by the MIDP 2.0 designers, who had to design an open, simple but powerful framework to manage multimedia data. The problem was always the same: the device has a small amount of memory and a not very powerful CPU. As they did for the other APIs, they decided to start from a subset of an existing API, the JMF (Java Media Framework). They had two alternatives:

-JMF 1.0

-JMF 2.0

The former was too limited in terms of openness and of the number of media types it supported, while the latter was too complex to be included in a small device, especially with CLDC 1.0, which lacks floating point types. They also had to deal with devices with or without media capture capabilities (i.e. a camera). The solution was to create two different APIs:

Mobile Media API (JSR-135)

MIDP 2.0 Media API

The former is a complete set of APIs for playing and recording every type of data, standard or custom. These APIs form an optional package that may be present on a device. The latter is a small subset of the former and runs on any MIDP 2.0 compatible device. We can say more about that: every MIDP 2.0 compliant device must support the MIDP 2.0 Media API and is therefore able to:

-Play sounds

-Manage different kinds of protocol, standard or custom

-Generate tones

-Manage the media lifecycle: start, pause, stop

-Expose some media controls, such as volume or tone

-Report metadata about which protocols and content types the device supports

In this module we’ll see the main concepts of both APIs. We’ll see how to manage data with a framework that uses a Factory, very similar to the Generic Connection Framework but applied, in this case, to the media Player rather than to the Connection. We’ll then see how to play some types of data compatible with a specific device. We’ll finish the module with a simple MIDlet that allows us to take a picture and send it to a simple web site where it can be displayed.

How the MIDP 2.0 Media API works

We can now study what tools the MIDP 2.0 Media API gives us to manage a multimedia application. As we can see from the javadoc we have two packages:

javax.microedition.media

javax.microedition.media.control


The first one contains the media framework, while the second contains the Control implementations for volume and tone. The fundamental part of the framework is represented by the following components (see Fig. 1):

-the Manager class

-the Player interface

-the Control interface

Fig. 1

The UML diagram shows that the Manager class is the entry point of the framework and acts as a Factory for the Player and Control implementations. Like other similar classes (e.g. the DriverManager in JDBC), the Manager class also gives us some methods to obtain the device capabilities in terms of the media types supported and the protocols used to get them. Later we’ll dive into the details, but for now we’ll just look at the big picture of the operations we have to do to play a simple piece of multimedia content using the API just described.

As we know, every kind of information is described by a content-type which is a string identifying the type of content we want to manage. Examples of content types are:

-audio/x-wav for WAV audio files

-audio/basic for the “old” AU audio format

-audio/midi for MIDI sequences

-audio/x-tone-seq for simple tone sequences

-audio/mpeg for the MP3 format

In our context the content type isn’t the only information we need. We also have to know how we can get the media: from the network with the HTTP protocol, from a stream, from the local file system, from a capture device, and so on. This information is contained in what we call a locator, which is a string. With a locator we describe the content type of the media, where it is, and the protocol we want to use to get it. This string is the key our Manager class (the Factory) uses to find the proper Player to employ. Before doing that, we can check whether the device can manage the media through some useful Manager methods we’ll see in the next paragraph. If the device supports the media format, the next step is to get the Player: we use the Factory capability of the Manager to get the proper Player realization to play the media. Since Player is an interface, we don’t know which specific Player realization the device will make available. We just know that it is a player, and we’ll interact with it through its interface; the actual realization is device-specific.

A Player is a very resource-intensive object, so it’s very important to manage it properly. For this reason we always have to know which step of the process the Player is in, in order to manage system resources properly. To do that, the Player generates events which are notified to the registered listeners through the PlayerListener interface callback method. We’ll have to create one or more PlayerListeners to be notified of the Player state changes.

Different players offer different ways to interact with them: a music player may have a volume control, a video player may have controls for brightness, colors, and so on. The responsibility for managing controls cannot belong to the Manager but must belong to the Player. As we can see from the UML diagram, the Player interface extends the Controllable interface, which describes operations to get a specific Control realization by name, or the complete set of available controls. So every Player owns a set of Controls to interact with. Different versions of the multimedia API also differ in the type and number of available controls. For the MIDP 2.0 Media API we have just two Control implementations:

- ToneControl

- VolumeControl

in the javax.microedition.media.control package.

The Manager class

In the previous section we saw the “big picture” of what we should do to play media. Now we’ll see how we can do this with the Manager class, which has to:

-Give information about the content types and protocols the device supports

-Act as a Factory for the Player, given the locator

-Play a single tone given the note, the duration and the volume

To ask the device which protocols and content types it can manage, we can use a set of static methods. If we want to know which content types are supported for a given protocol, we can use this method:

public static String[] getSupportedContentTypes(String protocol);


which returns an array of strings describing the content types. We can also do the opposite with the method:

public static String[] getSupportedProtocols(String contentType);


which returns an array of strings describing the protocols available to get contents of the specified type. For instance, if we want to know which content types are available through the HTTP protocol, we can use this method:

String[] httpTypes = Manager.getSupportedContentTypes("http");


If we want to know what protocols are available for getting a wav sound file we can use this method:

String[] wavProtocols = Manager.getSupportedProtocols("audio/x-wav");


If we specify a null value as the input parameter, the methods return all the content types or protocols available on the device. This information is clearly partial, because even if the device handles a specific protocol, this doesn’t mean that every content type is compatible with it.

To test which protocols and media types your device supports, we created the TestContentTypeMIDlet MIDlet, where you can see a practical use of the methods just described; a stripped-down sketch of the same idea follows.
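This sketch (the class name is illustrative, not part of the API or of the downloadable code) just dumps each supported content type together with the protocols that can deliver it:

import javax.microedition.media.Manager;

// Sketch: list every supported content type and, for each one,
// the protocols that can deliver it (output goes to the console here).
public class ContentTypeDump {

    public static void dump() {
        String[] types = Manager.getSupportedContentTypes(null); // null = all protocols
        for (int i = 0; i < types.length; i++) {
            StringBuffer line = new StringBuffer(types[i]).append(": ");
            String[] protocols = Manager.getSupportedProtocols(types[i]);
            for (int j = 0; j < protocols.length; j++) {
                line.append(protocols[j]).append(' ');
            }
            System.out.println(line.toString()); // or append it to a Form for display
        }
    }
}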

As we said before, the Manager is a Factory for the Player realization. We can obtain a Player using two different overloaded methods. The first one is the static method:

public static Player createPlayer(String loc) throws IOException, MediaException;


with the locator as the input parameter. If the locator value is not supported, an exception is thrown. In the signature of the method we see two exception types. The first is thrown in case of connection problems using the protocol specified in the locator, while the second is thrown when the proper player is not available. When we get the Player reference, we also have to deal with another important exception which is not listed in the signature because it’s a RuntimeException: a SecurityException, thrown when the user denies the permission request the device prompts for before playing the media. Playing media is sometimes an operation that needs an explicit permission from the user; it depends on the permission policy of the specific device.
This method is useful when we want to get the resource through the network as in this example:

- - -

Player player = null;
// Locator to get the media
String locator = "http://www.massimocarli.it/midpbook/media/hello.wav";
try {
    // Get the Player
    player = Manager.createPlayer(locator);
    // Player lifecycle management
    - - -
} catch (MediaException me) {
    // Manage MediaException
} catch (IOException ioe) {
    // Manage IOException
} catch (SecurityException se) {
    // Manage SecurityException
}

- - -


The second – very useful – overloaded method has this signature:

public static Player createPlayer(InputStream is, String type) throws IOException, MediaException;


In this case we have the InputStream to read from, and we ask the Manager for a Player that reads data of the specified type from that stream. This is very useful when we want to get the data, for instance, from the MIDlet’s own jar as a resource. In that case we can use simple code like this:

- - -

Player player = null;
// Name of the media file in the jar context
String fileInJar = "/media/hello.wav";
// Mime type of the related media
String contentType = "audio/x-wav";
try {
    // Get the InputStream from the jar file
    InputStream is = getClass().getResourceAsStream(fileInJar);
    // Get the related Player
    player = Manager.createPlayer(is, contentType);
    // Player lifecycle management
    - - -
} catch (Exception e) {
    // Generic exception management
}

- - -


The last Manager method is about tones. Through the static method:

public static void playTone(int note, int duration, int volume) throws MediaException


we can play a tone given its note, duration and volume. For instance, we can play two notes – an F and an A – for a second each with this simple code:

- - -

try {
    int F = 0x41;
    int A = 0x45;
    int SECOND = 1000;
    // Play an F
    Manager.playTone(F, SECOND, 100);
    // Play an A
    Manager.playTone(A, SECOND, 100);
} catch (MediaException me) {
    // Generic exception management
}

- - -


We won’t go into deeper detail about tones, but we can note that in this example the two notes will be played together. If we want to reproduce a sequence of notes we have to manage it with a Player associated with the audio/x-tone-seq content type, as sketched below.
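As a rough sketch of that approach, assuming the device’s built-in tone Player exposes a ToneControl (the control lives in javax.microedition.media.control; values and variable names here are illustrative):

try {
    // Create the Player for the tone device
    Player tonePlayer = Manager.createPlayer(Manager.TONE_DEVICE_LOCATOR);
    tonePlayer.realize();
    // getControl() may return null if ToneControl is not supported
    ToneControl toneControl = (ToneControl) tonePlayer.getControl("ToneControl");
    byte tempo = 30;      // tempo in units of 4 bpm (30 -> 120 bpm)
    byte quarter = 16;    // quarter-note duration, relative to the tempo
    byte F = 0x41;        // same note values used above
    byte A = 0x45;
    byte[] sequence = {
        ToneControl.VERSION, 1,
        ToneControl.TEMPO, tempo,
        F, quarter,       // each entry is a (note, duration) pair
        A, quarter
    };
    toneControl.setSequence(sequence);
    tonePlayer.start();   // now F and A are played one after the other
} catch (Exception e) {
    // The device has no tone Player or cannot play the sequence
}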

The Player interface

Ok, we got the Player for our media. What can we do now? We know that the Player allows us to play media and gives us a set of controls to interact with it. To do its job, the Player needs some resources, so it’s very important that the application knows its state and manages its state transitions when needed. The states a Player can be in are listed in the following table (static constants of the Player interface):

Player state Description
UNREALIZED The starting state of a Player, when it doesn’t know anything about the media it has to manage.
REALIZED The Player knows everything about the media it has to manage, but it hasn’t started it and it doesn’t yet own the exclusive resources it needs.
PREFETCHED The Player owns all the resources it needs and is ready to start playing.
STARTED The Player is playing.
CLOSED The Player has been closed and has released its resources. In this state the Player cannot be reused.

To see how we can manage the Player states we can look at Fig. 2. When the Player is created through the Manager factory method createPlayer(), it’s in the state described by the constant Player.UNREALIZED. In this state it doesn’t have all the information it needs to reproduce the media, so it can’t give information about it, nor can it create a Control implementation. In this state we can do two different things. The first is to invoke this method:

public void close()


which puts the Player in the CLOSED state. As we can see from the state diagram, we can invoke this method at any time; it closes the Player. Closing a Player right after creating it is not very useful, so we choose the second option, which is the invocation of the method:

public void realize() throws MediaException


which, if successful, puts the Player into the REALIZED state. The method can throw a MediaException if some problem arises while the media is examined to work out the resources to allocate. In this step the Player parses some media data to determine the type of resources it needs to reproduce it; often it also acquires non-exclusive resources. Once the Player is in the REALIZED state it cannot return to the UNREALIZED state. The method:

public void deallocate()


can bring the Player back to the UNREALIZED state only if it is invoked after realize() has been called but before that operation completes. We can see this also in the state diagram.
If the realize() method terminates successfully, the Player is now in the REALIZED state, in which it knows everything about the media and owns non-exclusive resources, but not exclusive ones. To get all the resources it needs we have to put it into the PREFETCHED state with the method:

public void prefetch() throws MediaException


If the method executes successfully, the Player now has all the resources it needs to start. If some resource is not available the method may throw a MediaException. At this point some devices may also buffer data to improve performance. An important point is that we can also invoke the prefetch() method when we are in the UNREALIZED state, causing an implicit invocation of the realize() method first. The same holds for the start() method, which we invoke to start reproducing the media. Its signature is:

public void start() throws MediaException


The other possibility when we’re in the PREFETCHED state is to invoke the deallocate() method we’ve already seen. This method puts the Player back into the REALIZED state, releasing all exclusive resources.
If the start() method executes successfully, the Player is in the STARTED state. Here we can invoke the method:

public void stop() throws MediaException


returning to the PREFETCHED state. At any time we can invoke the close() method to free resources and dispose of the Player, putting it into the CLOSED state.
Playing media is often simpler than it may seem. In most cases we just have to create the Player and start it, as shown in this example:

- - -

Player player = null;
// Filename inside the jar file
String fileInJar = "/media/hello.wav";
// Content type to reproduce
String contentType = "audio/x-wav";
try {
    // Get the InputStream from the file inside the jar
    InputStream is = getClass().getResourceAsStream(fileInJar);
    // Get the Player
    player = Manager.createPlayer(is, contentType);
    // Manage the Player lifecycle just by calling the start() method
    player.start();
    - - -
} catch (Exception e) {
    // Generic exception management
}

- - -


An important thing we have to keep in mind is that each Player should be driven from a single, dedicated thread. So the operations just described should be contained in the run() method of a Runnable executed by a Thread, as in the following sketch.
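A minimal sketch of that idea, reusing the illustrative file name and content type from the previous example (the class name is made up for this example):

import java.io.InputStream;
import javax.microedition.media.Manager;
import javax.microedition.media.Player;

// Sketch: drive the Player from its own thread instead of the UI thread.
public class PlayerTask implements Runnable {

    public void run() {
        Player player = null;
        try {
            InputStream is = getClass().getResourceAsStream("/media/hello.wav");
            player = Manager.createPlayer(is, "audio/x-wav");
            player.start(); // realize() and prefetch() are invoked implicitly
        } catch (Exception e) {
            // Report the problem (MediaException, IOException, SecurityException)
        }
    }
}

// In the MIDlet, for instance in response to a command:
// new Thread(new PlayerTask()).start();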

Fig. 2

Monitoring state transitions

Now we’re able to load a media file and play it with a Player obtained from the Manager. In the previous block of code we just called the start() method and waited for the media reproduction to finish. What if we need to manage the Player’s state transitions? What if we want to do something when the Player finishes playing? To do this we can use the PlayerListener interface we talked about before. If we want to be notified about the Player’s state transitions and other Player information, we just need to create a PlayerListener realization and register it with the Player (through its addPlayerListener() method); the Player will then notify events via the following listener callback method:

public void playerUpdate(Player player, String event, Object eventData)

As we can see from the method signature, the Player will send a reference to itself, the name of the event and some event-related data. The possible events are listed in the following table (each event name corresponds to a constant in the PlayerListener interface):

Event Description
STARTED The Player has started
STOPPED The Player has stopped
CLOSED The Player has been closed
END_OF_MEDIA The Player reached the end of the media
DURATION_UPDATED The duration of the media changed
VOLUME_CHANGED The volume of the Player changed
DEVICE_AVAILABLE The device notifies that an exclusive resource the Player needs is now available because it has been released by a higher priority process
DEVICE_UNAVAILABLE The device notifies that an exclusive resource the Player needs is no longer available because it has been acquired by a higher priority process
ERROR An error occurred in the Player; the event data contains a description of the error

Note that not all the events are about state transitions: some report changes in the volume or in the media duration. To better describe the use of this interface and the state transitions we created the PlayMIDIMIDlet MIDlet; a minimal listener realization is sketched below.
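This sketch reacts to just a couple of the events listed above (the class name is illustrative; PlayMIDIMIDlet in the download shows a fuller use of the same callback):

import javax.microedition.media.Player;
import javax.microedition.media.PlayerListener;

// Sketch: a listener that frees the Player when reproduction ends.
public class SimplePlayerListener implements PlayerListener {

    public void playerUpdate(Player player, String event, Object eventData) {
        if (PlayerListener.END_OF_MEDIA.equals(event)) {
            // Reproduction is over: release the Player resources
            player.close();
        } else if (PlayerListener.ERROR.equals(event)) {
            // eventData holds a String describing the error
            System.out.println("Player error: " + eventData);
        }
    }
}

// Registration, to be done before starting the Player:
// player.addPlayerListener(new SimplePlayerListener());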

Player parameters

Here we want to describe some other useful methods of the Player interface. We said that when a Player is in the REALIZED state, it knows everything about the media it has to play and the resources it needs to do that. In this state the Player owns some information it can expose through the methods listed in this section. The method:

public String getContentType()


simply returns the content type of the media being played back. Obviously, if we used the overloaded createPlayer() method that takes this information as a parameter, the value returned is the same. If we used the other version of createPlayer(), we didn’t specify this information, so it must be read from the media itself.

In the PlayerListener example we saw some numeric values displayed on the screen. They referred to what is called the media duration, which the Player exposes through its getDuration() method. The peculiar thing is that this time is expressed not in milliseconds (10⁻³ s) but in microseconds (10⁻⁶ s). Sometimes this information is not available, and in this case the method returns the value described by the constant Player.TIME_UNKNOWN.
If, while the media is playing, we’d like to know the elapsed playing time, we can use this method:

public long getMediaTime()


The starting point of the media reproduction time is 0, while the end point corresponds to the media duration. Here too, if the information is not available, the method returns Player.TIME_UNKNOWN. The interesting thing about the media time is that we may also change it, and so move along the media. We can do this with the method:

public long setMediaTime(long now) throws MediaException


The return value is the point of the media closest to the one passed as the input parameter: it’s not always possible to set the media time exactly as we wish, since the allowable values may be a finite set of points. If setting the media time is not supported, the method throws a MediaException.
Maybe the most important information the Player can give us is its state. We can retrieve it through the method:

public int getState()


which returns a value describing the state, which we can compare with the Player interface constants listed in the following table.

Player Status Description
Player.UNREALIZED UNREALIZED state
Player.REALIZED REALIZED state
Player.PREFETCHED PREFETCHED state
Player.STARTED STARTED state
Player.CLOSED CLOSED state

We saw that when a Player finishes playing the media, it notifies an END_OF_MEDIA event to all its listeners. Through the method:

public void setLoopCount(int count)


we may tell the Player to reproduce the media a specified number of times. The default is 1; if we set the value to -1, the Player will loop indefinitely. The sketch below puts these methods together.
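A minimal sketch, assuming player has already been obtained from one of the createPlayer() factory methods shown earlier (exception handling is reduced to a single catch for brevity):

try {
    player.realize();
    // In the REALIZED state the Player can describe the media
    System.out.println("Content type: " + player.getContentType());
    long duration = player.getDuration();          // in microseconds
    if (duration != Player.TIME_UNKNOWN) {
        System.out.println("Duration: " + (duration / 1000) + " ms");
    }
    player.setLoopCount(2);                        // play the media twice
    if (player.getState() == Player.REALIZED) {
        player.prefetch();                         // acquire the exclusive resources
    }
    player.setMediaTime(0);                        // start from the beginning
    player.start();
} catch (MediaException me) {
    // The media cannot be realized, positioned or started
}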

Something about the Control interface

As we read before, the MIDP 2.0 Media API gives us two Control implementations, for tone and volume. Here we won’t study these implementations in detail (you can simply read the javadoc), but we’ll describe how to use a generic Control.
When we get a Player we can invoke on it one of these two methods described by the Controllable interface:

public Control[] getControls()


or

public Control getControl(String controlType)


The first method returns the list of all the Controls available for a given Player. The second one returns a specific Control implementation given its name. The Control interface is interesting because it has no methods. This isn’t unusual in Java; just think of the Remote, Serializable or EventListener interfaces in J2SE. It means that a Control can be anything (i.e. an instance of any class); the only requirement is that it implements the Control interface. As we’ll see in the example, sometimes a Control is a Screen, so it has a GUI we can use to interact with the Player. A Control may be, for instance, an object which displays camera pictures, or an object that shows a set of controls for a tuner. This is the way a device can manage and control any media type in an open way: it can create its own Player implementation and fill it with all the Controls it needs. What the MMAPI does is supply more Player Control implementations for media reproduction and acquisition. As a simple example, the sketch below asks a Player for its VolumeControl.
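A minimal sketch, assuming player is already realized (getControl() cannot be called on an UNREALIZED or CLOSED Player, and it returns null when the requested control is not available):

import javax.microedition.media.control.VolumeControl;

// Sketch: retrieve a specific Control by name and use it.
VolumeControl volume = (VolumeControl) player.getControl("VolumeControl");
if (volume != null) {
    volume.setLevel(50);   // the volume level goes from 0 to 100
    volume.setMute(false);
}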

A more complex example

The MIDP 2.0 Media API allows us to play a number of media types; the list of supported formats is device-dependent. The MMAPI also allows us to acquire media. Here we’ll create a MIDlet which acquires an image and sends it to a specified URL; the picture will then be visible on the site behind that URL. The code to do that is in the SnapshotMIDlet MIDlet. We’re going to describe the part of the MIDlet regarding media management, contained in the VideoCamera class. Most of the important concepts are at the beginning of the file. As you can see, we have a SNAPSHOT_LOCATOR constant that contains the locator for video acquisition. Then we can see the references to the Player and to the VideoControl. It’s important to notice that the VideoCamera class implements the Runnable interface, so it can be executed in a Thread. The initPlayer() method does most of the work: it asks the Manager to create the Player implementation for the capture locator, realizes it and creates the VideoControl through the useVideoControl() method. There we ask the Player to create the VideoControl that will allow us to see the captured video on a Canvas on our display. We won’t go further into the details of this MIDlet’s code; it’s just an example of what you can do with the MMAPI, and the general pattern is sketched below. The complete application is very nice and lets you capture a snapshot and send it to the http://www.pikmik.org site to be voted on. Enjoy the MIDlet!
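A rough sketch of that capture pattern, under the assumption that the capture locator is the common capture://video and that cameraCanvas is a Canvas subclass of ours (this is illustrative code, not the actual SnapshotMIDlet source; VideoControl lives in javax.microedition.media.control):

// Sketch: show the camera viewfinder on a Canvas and take a snapshot.
Player capturePlayer = Manager.createPlayer("capture://video");
capturePlayer.realize();
VideoControl videoControl =
        (VideoControl) capturePlayer.getControl("VideoControl");
// Draw the incoming video directly on our Canvas
videoControl.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, cameraCanvas);
videoControl.setVisible(true);
capturePlayer.start();

// Later, typically from a command handler: grab the current frame
byte[] imageData = videoControl.getSnapshot(null);   // null = default encoding
capturePlayer.close();
// imageData can now be posted to the web site over HTTP
// (IOException/MediaException/SecurityException handling omitted here)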

Conclusion

In this module we covered some important concepts regarding media management in a mobile environment. We learned how to play sounds and do more complex things employing the MIDP 2.0 Media API. In particular, we dealt with the Manager class, the Player interface and the Control interface.


Related Download: work.zip
