JavaFX Media

JavaFX 1.2 Application Development Cookbook

This book is a collection of code recipes, examples, and informative discourses designed to enable the reader to get started with creating JavaFX applications quickly. The book is arranged as a series of loosely related code recipes that a reader can easily select to fit his or her needs. It exposes readers to a great variety of topics designed to satisfy different skill levels. Readers will learn about the language, animation techniques, paints, effects, JavaFX controls, integration of Swing components, styling with CSS, audio/video, deployment practices, and JavaFX integration with Adobe design tools.


What This Book Covers

Chapter 1, Getting Started with JavaFX… This is the “getting started” chapter of the book. It provides introductory materials to the platform, including installation instructions to get your environment set up. It also covers language basics such as classes, data types, function usage, variable declaration, data binding, triggers, Java and JavaFX integration.
Chapter 2, Creating JavaFX Applications… This chapter covers the essential building blocks of the JavaFX application framework, including primitive shapes, path, text, constructive area geometry, mouse/keyboard input, custom node, and window styling.
Chapter 3, Transformations, Animations, and Effects… This chapter explores the animation capabilities supported in JavaFX. You start with the Transition API to quickly build simple animations. The material continues to cover the KeyFrame API for more advanced animation sequences. You will learn about colors, effects, and how to create your own custom paint and effects.
Chapter 4, Components and Skinning… This chapter is divided into two sections. The first section shows readers how to use the set of standard JavaFX controls. The chapter also shows how to embed Swing components in your JavaFX scene graph. You will also learn how to create your own custom visual controls. The second section of the chapter introduces the reader to JavaFX’s support for CSS. The reader will learn how to style controls using inline and externalized CSS to create skins.
Chapter 5, JavaFX Media… One of the exciting features of JavaFX is its inherent support for multimedia. JavaFX includes support for rendering of images in multiple formats and support for playback of audio and video on all platforms where JavaFX is supported. In this chapter, readers learn how to display and manipulate images using the Image API. They will also learn how to play back both audio and video using the Media API. The chapter also shows how to create practical custom playback controls.
Chapter 6, Working with Data… JavaFX provides superb support for accessing and manipulating data both locally and remotely. In this chapter, readers are introduced to the Storage API for local data storage. It provides extensive coverage of JavaFX’s HttpRequest API for accessing data on remote web servers. Readers will learn how to use JavaFX’s XML and JSON parsers to build RESTful client mashups using popular services such as Google Maps, Yahoo Weather, and Zillow Listing. Finally, the chapter explores JavaFX’s built-in Chart API for data visualization.
Chapter 7, Deployment and Integration… This chapter provides coverage of the deployment mechanism supported by JavaFX. Readers will learn how to properly build and package their applications to target the different runtimes supported by JavaFX, including the web browser and the desktop. Readers learn how to create Java Web Start-ready applications using the build tools included in the SDK. The chapter shows how to write JavaScript that communicates with your JavaFX applet while running within the browser.
Chapter 8, The JavaFX Production Suite… This chapter covers JavaFX’s integral support for designer tools from Adobe, including Illustrator and Photoshop. Readers are walked through the process of exporting creative assets using the JavaFX Production Suite plugins available for these tools. The chapter also shows how to integrate exported objects from Photoshop and Illustrator into JavaFX.

JavaFX Media

In this chapter, we will cover the following topics:


  • Accessing media assets

  • Loading and displaying images with ImageView

  • Applying effects and transformations to images

  • Creating image effects with blending

  • Playing audio with MediaPlayer

  • Playing video with MediaView

  • Creating a media playback component

Introduction

One of the most celebrated features of JavaFX is its inherent support for media playback. As of version 1.2, JavaFX has the ability to seamlessly load images in different formats, play audio, and play video in several formats using its built-in components. To achieve platform independence and performance, the support for media playback in JavaFX is implemented as a two-tiered strategy:


  • Platform-independent APIs— the JavaFX SDK comes with a media API designed to provide a uniform set of interfaces to media functionalities. The platform-independence offerings include a portable codec (On2’s VP6), which will play on all platforms where JavaFX media playback is supported.

  • Platform-dependent implementations— to boost media playback performance, JavaFX also has the ability to use the native media engine supported by the underlying OS. For instance, playback on the Windows platform may be rendered by the Windows DirectShow media engine (see next recipe).


This chapter shows you how to use the supported media rendering components, including ImageView, MediaPlayer, and MediaView. These components provide high-level APIs that let developers create applications with engaging and interactive media content.

Accessing media assets

In previous chapters, you have seen the use of the variable __DIR__ when accessing local resources, but little detail was offered about its purpose and how it works. So, what does that special variable store? In this recipe, we will explore how to use the __DIR__ special variable and other means of loading resources locally or remotely.

Getting ready


The concepts presented in this recipe are used widely throughout the JavaFX application framework when pointing to resources. In general, classes that point to a local or remote resource use a string representation of a URL where the resource is stored. This is especially true for the ImageView and MediaPlayer classes discussed in this chapter.

How to do it…


This recipe shows you three ways of creating a URL to point to a local or remote resource used by a JavaFX application. The full listing of the code presented here can be found in ch05/source-code/src/UrlAccess.fx.
Using the __DIR__ pseudo-variable to access assets as packaged resources:

	var resImage = "{__DIR__}image.png";

Using a direct reference to a local file:

	var localImage =
		"file:/users/home/vladimir/javafx/ch005/source-code/src/image.png";

Using a URL to access a remote file:

	var remoteImage = "http://www.flickr.com/3201/2905493571_a6db13ce1b_d.jpg";

How it works…


Loading media assets in JavaFX requires the use of a well-formatted URL that points to the location of the resources. For instance, both the Image and the Media classes (covered later in this chapter) require a URL string to locate and load the resource to be rendered. The URL must be an absolute path that specifies the fully-realized scheme, device, and resource location. The previous code snippets show the following three ways of accessing resources in JavaFX:

  • __DIR__ pseudo-variable— often, you will see JavaFX’s pseudo-variable __DIR__ used when specifying the location of a resource. It is a special variable that stores the String value of the directory where the executing class that referenced __DIR__ is located. This is valuable, especially when the resource is embedded in the application’s JAR file. At runtime, __DIR__ stores the location of the resource in the JAR file, making it accessible for reading as a stream. In the previous code, for example, the expression {__DIR__}image.png expands to jar:file:/users/home/vladimir/javafx/ch005/source-code/dist/source-code.jar!/image.png.

  • Direct reference to local resources—when the application is deployed as a desktop application, you can specify the location of your resources using URLs that provide the absolute path to where the resources are located. In our code, we use file:/users/home/vladimir/javafx/ch005/source-code/src/image.png as the absolute, fully qualified path to the image file image.png.

  • Direct reference to remote resources—finally, when loading media assets, you are able to specify the path of a fully-qualified URL to a remote resource using HTTP.
    As long as there are no subsequent permissions required, classes such as Image and Media are able to pull down the resource with no problem. For our code, we use a URL to a Flickr image: http://www.flickr.com/3201/2905493571_a6db13ce1b_d.jpg.


There’s more…


Besides __DIR__, JavaFX provides the __FILE__ pseudo-variable as well. As you may well guess, __FILE__ resolves to the fully qualified path of the JavaFX script file that contains the __FILE__ pseudo-variable. At runtime, when your application is compiled, this will be the script class that contains the __FILE__ reference.
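As a quick illustration, the following sketch simply prints both pseudo-variables; the actual values printed depend entirely on where the script is compiled and run from, so no specific output is shown here:

	// Prints the directory containing this script class and the script's own path.
	println("__DIR__ = {__DIR__}");
	println("__FILE__ = {__FILE__}");

When the application is packaged in a JAR, both values resolve to jar: URLs rather than plain file paths, as discussed above.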

Loading and displaying images with ImageView

If you have already checked out recipes in previous chapters, you know by now that JavaFX provides classes that make it easy to load and display images. This recipe takes a closer look at the mechanics provided by the Image API to load and display images in your JavaFX applications.

Getting ready


This recipe uses classes from the Image API located in the javafx.scene.image package. Using this API, you are able to configure, load, and control how your images are displayed using the classes Image and ImageView. For this recipe, we will build a simple image browser to illustrate the concepts presented here. The browser allows users to load an image by providing its URL. You will use standard JavaFX controls, such as text boxes and buttons, to build the GUI. If you are not familiar with the standard GUI controls, review the recipe Creating a form with JavaFX controls from Chapter 4, Components and Skinning.

How to do it…


The code given next has been shortened to illustrate the essential portions involved in loading
and displaying an image. You can get a full listing of the code from ch05/source-code/src/image/ImageBrowserSimpleDemo.fx.

 
	def w = 800;
	def h = 600;
	var scene:Scene;
	def maxW = w * 0.9;
	def maxH = h * 0.9;
	def imgView:ImageView = ImageView{
		preserveRatio:true
		fitWidth: maxW fitHeight:maxH
		layoutX:(w-maxW)/2 layoutY:(h-maxH)/2
	};
	function loadImg(){
		imgView.image = Image{
			url:(scene.lookup("addr") as TextBox).text
			backgroundLoading:true
			placeholder:Image{url:"{__DIR__}loading.png"}
		}
	}
	def addrBar = Group{
		layoutX: 20
		layoutY: 20
		content:HBox {
			nodeVPos:VPos.CENTER
			spacing:7
			content:[
				Label{text:"Image URL:" textFill:Color.SILVER}
				TextBox{id:"addr" columns:80 promptText:"http://"
					action:function(){loadImg()}
				}
				Button{id:"btnGo" text:"Get Image"
					action:function(){loadImg()}
				}
			]
		}
	}

When the variables imgView and addrBar are placed on the scene and the application is executed, you will get the results as shown in the following screenshot:

The image shown in this screenshot is licensed under Creative Commons. For additional information and licensing details, go to http://www.flickr.com/photos/motleypixel/2905493571/sizes/m/.

How it works…


Loading and displaying images in JavaFX involves two classes, Image and ImageView. While
class Image is responsible for accessing and managing the binary stream of the image, ImageView, on the other hand, is of the type Node and is responsible for displaying the loaded image on the scene. The code presented in this recipe lets the user enter a URL for an image and loads the image on the screen. Let’s take a closer look at the code and how it works:

  • The ImageView—the first significant item to notice is the declaration of an ImageView instance assigned to the variable imgView. This is the component that will display the image on the scene when it is fully loaded. We specify the properties fitWidth, fitHeight, and preserveRatio. These properties will cause imgView to stretch (if the image is smaller than specified) or shrink (if the image is larger than specified) while preserving the aspect ratio of the image.

  • Image URL bar— the form that captures the URL of the image to load is grouped in the Group instance variable addrBar. The form consists of a Label, a TextBox, and a Button instance. The TextBox instance has several properties set, including id:"addr", which allows us to find a reference to it in the code. Both the TextBox and the Button instances have their action properties defined as a function that invokes the function loadImg(). Therefore, when the TextBox has focus and the Enter key is pressed, or when the Button instance is clicked, the image will be loaded.

  • Loading the image—the image is loaded by calling the function loadImg(). It assigns an instance of Image to imgView.image. For the Image.url property, we use the Scene.lookup(id:String) function to retrieve an instance of the TextBox using its id of addr. For images that may take a while to load, we set up the following two properties:

    • To ensure that the application does not hang while the image loads, the property backgroundLoading:Boolean is set to true. This causes the GUI to remain responsive while an image loads.

    • The property placeholder:Image is used to specify a local image to use while the remote image is loading, as shown in the previous screenshot. For example, we use the local image {__DIR__}loading.png. It gets loaded immediately and remains on the screen while the remote image loads. When the remote image is loaded, it replaces the placeholder image.



There’s more…

Format support

As of version 1.2, JavaFX has inherent support for the most popular web image formats, including PNG, JPG, BMP, and GIF. If you have requirements for formats other than these, such as TIFF for instance, you will have to take matters into your own hands and use an external image library such as the Java Advanced Imaging (JAI) API (not covered here).

Asynchronous loading issues

As mentioned in the previous section, when you are loading images from locations with high latency (over the network for instance), you can use the asynchronous background-loading option for your image. This causes the image-loading operation to occur in a separate execution thread to keep your GUI responsive.
This, however, presents an issue: if you want to determine the dimensions of the image (which are available only after the image is fully downloaded), they will be reported as zero when loading asynchronously, as shown in the next segment:

 
	def img = Image{
		url:"http://someimage.com/img.png"
		backgroundLoading:true
	}//does not wait here, it continues to next line
	println (img.width); // prints 0

This is because the image is still being downloaded on the image-loading thread, and the main GUI thread does not wait for completion but continues with its execution. Therefore, when we query the width property of Image, it is still zero.
Unfortunately, in version 1.2 the Image class does not offer event notification functions to know when an image is done loading. If your code relies on knowing the actual size of the image, you must load the image synchronously (by setting backgroundLoading:false), which blocks until the image is downloaded and its size is available. Another way around this is to specify the dimensions of the image yourself (see the next sub-section on Image resize and aspect ratio).
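One workaround, if you only need to react once the download finishes, is to watch the Image.progress property, which advances from 0.0 to 100.0 as the image is fetched. The following sketch (the URL here is purely illustrative) uses a bound variable with a replace trigger to detect completion:

	def img = Image {
		url: "http://example.com/img.png" // hypothetical URL
		backgroundLoading: true
	}
	// progress reaches 100.0 when the download is complete
	var pct = bind img.progress on replace {
		if (pct == 100.0) {
			println("Image loaded: {img.width} x {img.height}");
		}
	}

For images that fail to load, the Image class also exposes an error flag that can be checked in a similar fashion.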

Image resize and aspect ratio

Another feature supported by Image and ImageView is the automatic resizing of the image. The Image class will attempt to resize the image when a value is provided for the properties width:Number or height:Number. ImageView will attempt to do the same when the properties fitWidth:Number and fitHeight:Number are specified. Both classes support the property preserveRatio:Boolean, which forces the resize operation to maintain the aspect ratio of the original image while resizing to the specified dimensions, as shown next:

	def imgView:ImageView = ImageView{
		preserveRatio:true
		fitWidth: 200
	};

The previous code will resize the image to a width of 200 pixels. Because the preserveRatio property is true, the height of the image is automatically calculated; a 400 x 300 image given fitWidth:200, for example, is displayed at 200 x 150. This is useful especially if you do not know the actual size of the image ahead of time (see the previous section).

See also



  • Chapter 4—Creating a form with JavaFX controls

  • Introduction

  • Accessing media assets

Applying effects and transformations to images

Now that you have learned how to load images, what can you do with them? Well, since ImageView is an instance of the Node class, your loaded images can receive the same treatment you would ordinarily give any other node, such as a shape. In this recipe, we are going to extend the example from the previous recipe, Loading and displaying images with ImageView, to add image manipulation functionalities.

Getting ready


In this recipe, we are going to reach back to some of the concepts learned in previous chapters to extend the image browser example presented in the previous recipe. We will make use of JavaFX GUI controls and node effects. If you are not familiar with either of these topics, please review the recipes from Chapter 3, Transformations, Animations, and Effects, and Chapter 4, Components and Skinning.
The example presented here extends the image browser from the previous recipe to add image manipulation capabilities. The new version adds GUI controls to scale, rotate, add effects, and animate the loaded image.

How to do it…


The code snippet presented next has been abbreviated to concentrate on the more interesting aspects of the code. You can access the full code listing from ch05/source-code/src/image/ImageBrowserExtendedDemo.fx.

	def w = 800;
	def h = 600;
	def maxW = w * 0.7;
	def maxH = h * 0.7;
	var scene:Scene;
	def slider = Slider{min:1 max:1.5 value:1}
	def imgView:ImageView = ImageView{
		preserveRatio:true
		fitWidth:bind if((slider.value*maxW) < w)
		maxW * slider.value else w
		fitHeight:bind if((slider.value*maxH) < h)
		maxH * slider.value else h
	};
	var anim = TranslateTransition{
		fromX:0 toX:w - maxW
		node:imgView repeatCount:TranslateTransition.INDEFINITE
		autoReverse:true
	}
	var rotateAngle = 0;
	... //Address Bar Group and loadImg() function not shown
	def footer = Group{
		layoutX: 20
		layoutY: h - 60
		content:HBox {
			spacing: 12
			content:[
				slider,
				Button{text:"Rotate" action:function(){
					rotateAngle = rotateAngle + 90;
					imgView.rotate = rotateAngle;
				}}
				HBox{spacing:7 content:[
					Button{text:"Reflection"
					onMouseClicked:function(e){
						imgView.effect =
							if(imgView.effect == null or
								not (imgView.effect instanceof
								Reflection))
								Reflection{fraction:0.3 topOffset:0}
							else null
					}}
					... // Other effects omitted
					Button{text:"Sepia"
						onMouseClicked:function(e){
							imgView.effect = if(imgView.effect == null
								or not (imgView.effect instanceof
								SepiaTone)
							)
							SepiaTone{level:0.7}
							else null
						}}
						Button{text:"Animate"
							onMouseClicked:function(e){
								if(not anim.running){
									anim.play();
								}else{
									anim.stop();
								}
						}}
				]}
			]
		}
	}

When the ImageView, the Slider, and the other GUI controls are added to the stage, and the application is executed, it will look like what is shown in the next screenshot. In it, you can see the reflection effect applied to the image.

How it works…


In the recipe Loading and displaying images with ImageView we have seen how to use the Image API to load and display local or remote images. This recipe extends the code in that recipe to not only load the image, but also apply effects and animations to it.
As shown in the previous screenshot, this version of the image browser includes a row of GUI controls at the bottom of the screen that are used to apply different transformations and effects to the loaded image. Let’s take a closer look at how the code works:

  • Scaling the image— using an instance of the Slider control you can dynamically grow or shrink the image. To do this, we bind the properties ImageView.fitWidth and ImageView.fitHeight to Slider.value. This causes the size of the image to grow or shrink dynamically, while maintaining the proper image aspect ratio. The bound expression includes logic to ensure that the image does not grow excessively large when it is scaled up, as shown below:
    	ImageView{
    		fitWidth:bind if((slider.value*maxW) < w)
    			maxW * slider.value else w
    		fitHeight:bind if((slider.value*maxH) < h)
    			maxH * slider.value else h
    	};
    

  • Image rotation— the Button instance with the label “Rotate” rotates the image instance by 90 degrees with each click by setting the imgView.rotate property.

  • Image effects— the next five buttons in the code apply effects reflected in their
    respective names. These buttons apply the Reflection, Glow, GaussianBlur, Lighting (using a PointLight effect), and SepiaTone effects to the image (only Reflection and Sepia are listed in the code). All buttons work in the same way:
    if the effect currently applied to the image is null or the effect is not of the desired type, then apply the desired effect; otherwise, if the effect is already applied, turn it off. This makes each button toggle its assigned effect on and off.

  • Image animation—the last Button control plays the TranslateTransition instance assigned to the variable anim. The transition animation moves the image from side-to-side indefinitely until the button is pressed again to stop the animation.


See also



  • Chapter 3—Transformation, animations, and effects

  • Chapter 4—Components and skinning

  • Loading and displaying images with ImageView

Creating image effects with blending

In the previous recipe, we saw how easy it is to build an application that loads, displays, and applies effects to images. In this recipe, we are going to explore how to create new visual effects by blending two separate image sources.

Getting ready


For this recipe, you will need to be familiar with the concepts of loading and displaying images in your application using the Image API. If necessary, review the recipe Loading and displaying images with ImageView. Part of the code also uses transition animation to slide the images one on top of the other. If you need to review topics regarding animation, refer to the recipe Creating simple animation with the Transition API from Chapter 3, Transformations, Animations, and Effects. Lastly, the recipe makes use of GUI controls to capture image URLs and action buttons to apply the effects. If you are not familiar with JavaFX’s GUI controls, review the recipe Creating a form with JavaFX controls from Chapter 4, Components and Skinning.

How to do it…


The code listing given next is abbreviated to show the essential portions that drive the application. You can get the full listing of this code from ch05/source-code/src/image/ImageBlendDemo.fx.

 
	var scene:Scene;
	def w = 800; def h = 600;
	def maxW = w * 0.4; def maxH = h * 0.5;
	def img1 = ImageView{
		translateX:10 translateY:10
		preserveRatio:true
		fitWidth:maxW fitHeight:maxH
	}
	def img2 = ImageView{
		translateX:w - maxW translateY:10
		preserveRatio:true
		fitWidth:maxW fitHeight:maxH
	}
	def imgPanel = Group {content:[img1, img2]}
	def anim = Timeline {
		keyFrames: [
			KeyFrame{time:1s
				values: [
					img1.translateX => (w - img1.fitWidth)/2
				]
			}
			KeyFrame{time:1s
				values: [
					img2.translateX => (w - img2.fitWidth)/2
				]
			}
		]
	}
	// fn to load img
	function loadImg(view:ImageView,url:String){
		view.effect = null;
		view.image = Image{
			backgroundLoading:true
			url:url
		}
	}
	// controls bottom of screen
	def toggleGrp = ToggleGroup{}
	def controls = Group{
		layoutY: h - 200
		content:[
			VBox{width:w spacing:12
				hpos:HPos.CENTER nodeHPos:HPos.CENTER content:[
					TextBox{id:"addr1" columns:60 promptText:"http://"
						action:function(){
							loadImg(img1,
							(scene.lookup("addr1") as TextBox).text)
					}}
					TextBox{id:"addr2" columns:60 promptText:"http://"
					action:function(){
						loadImg(img2,
						(scene.lookup("addr2") as TextBox).text)
					}}
					HBox{
						content:[
							RadioButton{text:"ADD"
								toggleGroup:toggleGrp selected:true
							}
							... // other blending modes omitted
							RadioButton{text:"LIGHTEN"
								toggleGroup:toggleGrp
							}
						]
					}
					HBox{
						content:[
							RadioButton{text:"MULTIPLY"
							toggleGroup:toggleGrp
							}
							... //other blending modes omitted
							RadioButton{text:"SRC_OVER"
								toggleGroup:toggleGrp
							}
						]
					}
					Button{
						text:"Blend Images"
						font:Font.font("Sans Serif",
						FontWeight.BOLD, 18)
					effect:DropShadow{offsetX:3 offsetY:3}
					onMousePressed:function(e){
						def mode = toggleGrp.selectedButton.text;
						imgPanel.blendMode = BlendMode.valueOf(mode);
						anim.rate = 1.0;
						anim.playFromStart();
					}
					onMouseReleased:function(e){
						anim.rate = -1.0;
						anim.play();
					}
				}
			]}
		]
	}

When the Group instances imgPanel and controls are placed on the stage, and the application is executed, it produces the next screenshot. The application lets users enter the URLs of two images and select a blend mode. When the Blend Images button is pressed, the images slide to overlap each other and apply the blend effect:

How it works…


The Group class (a node itself) allows the grouping of two or more nodes to be placed on the scene graph. One of the features of the Group node is its ability to apply a blending algorithm to the group’s members. It applies its algorithm to all children in its content property when a blend mode is provided through the blendMode:BlendMode property.
In the previous sample code provided, we use Group instance imgPanel to apply blending effects to two images placed in the group. Let’s take a closer look at how the application works:

  • The images—the first thing we do in the code is to declare two instances of ImageView, img1 and img2. To ensure that the images fit in a pre-determined dimension on the screen, we set the properties fitWidth and fitHeight on the two instances. Then, we place the two images in a Group instance called imgPanel, where they will receive blending effects.

  • The image animation— to make things a little interesting, the code uses an instance of Timeline to animate the two images. The first KeyFrame instance slides img1 from the left-hand side to the middle of the screen, and the second KeyFrame instance slides img2 from the right-hand side to the middle of the screen. The two images stack up in the middle of the screen where you can see the selected blending effect applied.

  • Loading the images—when the user types the URL location of the images in the TextBox instances, with property id:"addr1" and id:"addr2", and presses Enter, this invokes the function loadImg(). That function loads and attaches the loaded image to the ImageView instances img1 and img2, respectively.

  • Applying the blend—Group variable controls contains two rows of RadioButton instances (not all shown in the previous code). For each instance of RadioButton, the code assigns the name of a BlendMode as its text content (that is, "ADD", "COLOR_BURN", "MULTIPLY", and so on). When the user clicks on the button titled Blend Images, it creates a BlendMode object using the text of the selected radio button, and applies it to the imgPanel Group containing the images, as shown:

	def mode = toggleGrp.selectedButton.text;
	imgPanel.blendMode = BlendMode.valueOf(mode);

BlendMode.valueOf(:String) returns an instance of BlendMode based on a String.

There’s more…


JavaFX supports a multitude of blending options. The following table shows a list of the more interesting modes:

The BlendMode class offers more blend modes, including RED, GREEN, BLUE, COLOR_DODGE, HARD_LIGHT, SOFT_LIGHT, SRC_ATOP, SRC_IN, SRC_OUT, and SRC_OVER.

See also



  • Chapter 3—Creating simple animation with the transition API

  • Chapter 4—Creating a form with JavaFX controls

  • Loading and displaying images with ImageView

Playing audio with MediaPlayer

Playing audio is another important aspect of any rich client platform. One of the celebrated features of JavaFX is its ability to easily play back audio content. This recipe shows you how to write code that plays back audio resources using the MediaPlayer class.

Getting ready


This recipe uses classes from the Media API located in the javafx.scene.media package. As you will see in our example, using this API you are able to load, configure, and play back audio using the classes Media and MediaPlayer. For this recipe, we will build a simple audio player to illustrate the concepts presented here. Instead of using standard GUI controls, we will use button icons loaded as images. If you are not familiar with the concept of loading images, review the recipe Loading and displaying images with ImageView in this chapter.
In this example we will use a JavaFX podcast from the Oracle Technology Network TechCast series in which Nandini Ramani discusses JavaFX. The stream can be found at http://streaming.oracle.com/ebn/podcasts/media/8576726_Nandini_Ramani_030210.mp3.

How to do it…


The code given next has been shortened to illustrate the essential portions involved in loading and playing an audio stream. You can get the full listing of the code in this recipe from ch05/source-code/src/media/AudioPlayerDemo.fx.

 
	def w = 400;
	def h = 200;
	var scene:Scene;
	def mediaSource = "http://streaming.oracle.com/ebn/podcasts/media/8576726_Nandini_Ramani_030210.mp3";
	def player = MediaPlayer {media:Media{source:mediaSource}}
	def controls = Group {
		layoutX:(w-110)/2
		layoutY:(h-50)/2
		effect:Reflection{
			fraction:0.4 bottomOpacity:0.1 topOffset:3
		}
		content:[
			HBox{spacing:10 content:[
					ImageView{id:"playCtrl"
					image:Image{url:"{__DIR__}play-large.png"}
					onMouseClicked:function(e:MouseEvent){
						def playCtrl = e.source as ImageView;
						if(not(player.status == player.PLAYING)){
							playCtrl.image =
								Image{url:"{__DIR__}pause-large.png"}
								player.play();
						}else if(player.status == player.PLAYING){
							playCtrl.image =
								Image{url:"{__DIR__}play-large.png"}
							player.pause();
						}
					}
				}
				ImageView{id:"stopCtrl"
					image:Image{url:"{__DIR__}stop-large.png"}
					onMouseClicked:function(e){
						def playCtrl = e.source as ImageView;
						if(player.status == player.PLAYING){
							playCtrl.image =
								Image{url:"{__DIR__}play-large.png"}
							player.stop();
						}
					}
				}
			]}
		]
	}

When the variable controls is added to a scene object and the application is executed, it produces the screen shown in the following screenshot:

How it works…


The Media API comprises several components which, when put together, provide the mechanism to stream and play back the audio source. Playing back audio requires two classes: Media and MediaPlayer. Let’s take a look at how these classes are used to play back audio in the previous example.

  • The MediaPlayer—the first significant item in the code is the declaration and initialization of a MediaPlayer instance assigned to the variable player. To load the audio file, we assign an instance of Media to the player.media property. The Media class specifies the location of the audio; in our example, it is a URL that points to an MP3 file.

  • The controls—the play, pause, and stop buttons are grouped in the Group object called controls. They are made of three separate image files: play-large.png, pause-large.png, and stop-large.png, loaded by two instances of the ImageView class. The ImageView objects serve to display the control icons and to control the playback of the audio:


    • When the application starts, the play ImageView displays the image play-large.png. When the user clicks on the image, it invokes its action-handler function, which first checks the status of the MediaPlayer instance. If the player is not playing, the handler starts playback of the audio source by calling player.play() and replaces play-large.png with the image pause-large.png. If, however, audio is currently playing, the handler pauses playback with player.pause() and restores play-large.png.

    • The other ImageView instance loads the stop-large.png icon. When the user clicks on it, its action-handler stops the audio playback by calling player.stop(), then toggles the image of the "play" button back to the icon play-large.png.


As mentioned in the introduction, JavaFX will play the MP3 file format on any platform where the JavaFX runtime is supported. Any format other than MP3 must be supported natively by the media engine of the OS where the file is played back. For instance, on Mac OS, MPEG-4 plays because it is a playback format supported by the OS's QuickTime engine.

There’s more…


The Media class models the audio stream. It exposes properties to configure the location of the source, to resolve the dimensions of the medium (where available; for audio, that information is not available), and to provide the tracks and metadata of the resource to be played.
The MediaPlayer class itself is a controller class responsible for controlling playback of the medium by offering control functions such as play(), pause(), and stop(). It also exposes valuable playback data, including the current position, volume level, and status. We will use these additional functions and properties to extend our playback capabilities in the recipe Controlling media playback, later in this chapter.
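As a quick illustration of these additional members, the minimal sketch below configures a MediaPlayer with an initial volume and an onEndOfMedia handler (both part of the javafx.scene.media.MediaPlayer API); the mediaSource variable is assumed from the recipe above:

```
	// A minimal sketch, assuming the mediaSource variable from the recipe.
	def player = MediaPlayer {
		media: Media { source: mediaSource }
		volume: 0.8                  // playback volume as a ratio, 0.0 .. 1.0
		onEndOfMedia: function() {   // invoked when playback reaches the end
			println("Finished after {player.currentTime.toSeconds()} seconds");
		}
	}
```

The currentTime and volume properties shown here are the same ones the recipe Controlling media playback builds on.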

See also



  • Accessing media assets

  • Loading and displaying images with ImageView

Playing video with MediaView

The previous recipe shows you how to play audio using the JavaFX Media API. This recipe builds on the versatility of the Media API, extending the previous recipe, Playing audio with MediaPlayer, to create a video player with only a few changes to the code.

Getting ready


This recipe uses classes from the Media API located in the javafx.scene.media package. As mentioned in the introduction of this recipe, the example presented here extends the code from the previous recipe to transform the audio player into a video player. We are going to reuse the same icons and the same logic to control the playback of the video. For details on configuring and using the Media API for playback, review the previous recipe Playing audio with MediaPlayer.
To illustrate video playback, the application plays the award-winning, open source, short animated movie Big Buck Bunny. By default, the recipe plays the 854 x 480 H.264 version found at http://mirror.bigbuckbunny.de/peach/bigbuckbunny_movies/big_buck_bunny_480p_h264.mov.

How to do it…


Similar to audio, playing video is simple. The abbreviated code given next highlights the portions of the code that change in order to display video. You can see the full listing of the code at ch05/source-code/src/media/VideoPlayerDemo.fx.

 
	def w = 800;
	def h = 600;
	def maxW = w * 0.8;
	def maxH = h * 0.7;
	var scene:Scene;
	def mediaSource = "http://mirror.bigbuckbunny.de/peach/"
		"bigbuckbunny_movies/big_buck_bunny_480p_h264.mov";
	def player = MediaView{
		layoutX:(w - maxW)/2 layoutY:(h-maxH)/2
		mediaPlayer:MediaPlayer {media:Media{source:mediaSource}}
		fitWidth:maxW fitHeight:maxH
	}
	// the play/pause toggle button
	def playCtrl:ImageView = ImageView{id:"playCtrl"
		image:Image{url:"{__DIR__}play-large.png"}
		onMouseClicked:function(e:MouseEvent){
			if(not(player.mediaPlayer.status == MediaPlayer.PLAYING)){
				playCtrl.image = Image{url:"{__DIR__}pause-large.png"}
				player.mediaPlayer.play();
			}else{
				playCtrl.image = Image{url:"{__DIR__}play-large.png"}
				player.mediaPlayer.pause();
			}
		}
	}
	// the stop button; it resets the play button's icon, not its own
	def stopCtrl:ImageView = ImageView{id:"stopCtrl"
		image:Image{url:"{__DIR__}stop-large.png"}
		onMouseClicked:function(e:MouseEvent){
			if(player.mediaPlayer.status == MediaPlayer.PLAYING){
				playCtrl.image = Image{url:"{__DIR__}play-large.png"}
				player.mediaPlayer.stop();
			}
		}
	}
	def controls = Group {
		layoutX:(w-110)/2
		layoutY:h-100
		effect:Reflection{
			fraction:0.4 bottomOpacity:0.1 topOffset:3
		}
		content:[HBox{spacing:10 content:[playCtrl, stopCtrl]}]
	}

When the Group variable controls and the MediaView instance player are placed on the scene, the application creates a window as shown in the next screenshot.

How it works…


While playing audio requires only the classes Media and MediaPlayer, playing video requires an additional class called MediaView. MediaView is a Node and is used to display the content of a video on the screen. Let's take a closer look at the code:

  • The MediaView—the first major component to be initialized is the MediaView assigned to the variable player. The code uses the MediaView instance to configure the dimensions and the location where the video will be rendered. To control playback, the code assigns an instance of MediaPlayer to the player.mediaPlayer property. The MediaPlayer is in turn given an instance of Media (through the property MediaPlayer.media) that specifies the location of the video resource we want to play back.

  • The controls—the GUI controls in this example work the exact same way as described in Playing audio with MediaPlayer. We use a group of image icons to represent the playback functions play, pause, and stop. When the play icon is pressed, it starts playing the video by calling the player.mediaPlayer.play() function and toggles itself to the pause icon. When the pause icon is pressed, it pauses the video using the function player.mediaPlayer.pause(). Finally, when the user presses the stop button, it makes a call to player.mediaPlayer.stop() to stop playback and toggles the play button back to the play icon.


There’s more…


Processing video is expensive. The JavaFX MediaView class supports properties that can be used to provide rendering-time hints to maximize playback performance. These Boolean properties include:

  • compositable:Boolean—if true, other nodes may overlay the MediaView node using transparency.

  • preserveRatio:Boolean—if true, the aspect ratio of the video is preserved when the node is resized through the fitWidth or fitHeight property.

  • rotatable:Boolean—if true, the MediaView node can receive rotation requests through the rotate property.

  • transformable:Boolean—if true, the node applies the transformations specified in the transforms:Transform[] property.

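Putting these hints together, a MediaView might be configured as in the following sketch; the property values are illustrative choices, not requirements, and mediaSource is assumed from the recipe above:

```
	def view = MediaView {
		mediaPlayer: MediaPlayer { media: Media { source: mediaSource } }
		fitWidth: 640 fitHeight: 360
		preserveRatio: true    // keep the video's aspect ratio when fitting
		compositable: true     // allow other nodes to overlay with transparency
		rotatable: false       // ignore rotate requests to save rendering work
		transformable: false   // ignore the transforms[] sequence as well
	}
```

Disabling the hints you do not need lets the rendering pipeline skip the corresponding work on every frame.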

See also



  • Accessing media assets

  • Playing audio with MediaPlayer

Creating a media playback component

The previous two recipes, Playing audio with MediaPlayer and Playing video with MediaView, show you how to quickly build applications that play back media sources with basic controls such as play, pause, and stop. The Media API, however, supports more functionality than has been discussed so far. This recipe shows you how to build a custom media component that plays back media sources with extended functionality such as fast-forward, reverse, and timing information.

Getting ready


This recipe uses classes from the Media API located in the javafx.scene.media package. The example presented here extends the code from the previous recipe, Playing video with MediaView, to create a playback component. The component takes advantage of the functionality and runtime data provided by the Media API to extend the features of the video player example. Before you continue, ensure that you are familiar with the materials covered in the recipes Playing audio with MediaPlayer and Playing video with MediaView.

How to do it…


The shortened code given next highlights the more significant items involved in creating the playback component. You can access the full listing of the code from ch05/source-code/src/media/MediaControllerComponent.fx.

  1. Let’s define class MediaController as CustomNode that encapsulates the
    playback icons/buttons and control logic:
  2. 	class MediaController extends CustomNode{
    		public var mediaPlayer:MediaPlayer;
    		var timestat = bind
    			"{%02d mediaPlayer.currentTime.toHours() mod 12 as Integer}:"
    			"{%02d mediaPlayer.currentTime.toMinutes() mod 60 as Integer}:"
    			"{%02d mediaPlayer.currentTime.toSeconds() mod 60 as Integer}/"
    			"{%02d mediaPlayer.media.duration.toHours() mod 12 as Integer}:"
    			"{%02d mediaPlayer.media.duration.toMinutes() mod 60 as Integer}:"
    			"{%02d mediaPlayer.media.duration.toSeconds() mod 60 as Integer}";
    		// image icons
    		def imgReverse = Image{url:"{__DIR__}reverse-small.png"};
    		def imgPlay = Image{url:"{__DIR__}play-small.png"};
    		def imgPause = Image{url:"{__DIR__}pause-small.png"};
    		def imgFfwd = Image{url:"{__DIR__}ffwd-small.png"};
    		def imgVolup = Image{url:"{__DIR__}volup-small.png"};
    		def imgVolDn = Image{url:"{__DIR__}voldown-small.png"};
    			def controls = Group {
    				content:[
    					HBox{spacing:10 content:[
    						// reverse button
    						ImageView{id:"reverseCtrl" image:imgReverse
    							onMousePressed:function(e:MouseEvent){
    								mediaPlayer.currentTime =
    								mediaPlayer.currentTime
    								- (mediaPlayer.media.duration * 0.01);
    							}
    						}
    						// play button
    						ImageView{id:"playCtrl" image:imgPlay
    							onMouseClicked:function(e:MouseEvent){
    								... // starts media playback
    							}
    						}
    						// fast forward
    						ImageView{id:"ffwdCtrl" image:imgFfwd
    							onMousePressed:function(e:MouseEvent){
    								mediaPlayer.currentTime =
    								mediaPlayer.currentTime
    								+ (mediaPlayer.media.duration * 0.01);
    							}
    						}
    						// volume down
    						ImageView{id:"voldn" image:imgVolDn
    						onMouseClicked:function(e){
    							mediaPlayer.volume =
    							mediaPlayer.volume - 0.4;
    						}
    					}
    					// volume up
    					ImageView{id:"volup" image:imgVolup
    						onMouseClicked:function(e){
    							mediaPlayer.volume =
    							mediaPlayer.volume + 0.4;
    						}
    					}
    				]}
    				// progress bar
    				Line{
    					startX:0 startY:40 endX:100 endY:40
    					stroke:Color.MAROON
    				}
    				Circle{
    					radius:5
    					fill:Color.MAROON
    					centerX:bind
    					if(mediaPlayer.media.duration > 0ms)
    						(mediaPlayer.currentTime /
    						mediaPlayer.media.duration)*100
    					else 5
    					centerY:40
    				}
    				Text{
    					x:105 y:35
    					textAlignment:TextAlignment.LEFT
    					textOrigin:TextOrigin.TOP
    					font:Font.font("Sans Serif", 10)
    					content: bind timestat
    				}
    			]
    		}
    		override protected function create () : Node {
    			return controls
    		}
    	}
    

  3. The next code segment shows you how to use the MediaController class defined earlier:
  4.  
    	def w = 800;
    	def h = 600;
    	def maxW = w * 0.8;
    	def maxH = h * 0.7;
    	var scene:Scene;
    	def mediaSource = "http://mirror.bigbuckbunny.de/peach/bigbuckbunny_
    		movies/big_buck_bunny_480p_h264.mov";
    	def video = MediaView{
    		layoutX:(w - maxW)/2 layoutY:(h-maxH)/2
    		mediaPlayer:MediaPlayer {media:Media{source:mediaSource}}
    		fitWidth:maxW fitHeight:maxH
    	}
    	def controls = MediaController {
    		mediaPlayer: video.mediaPlayer
    		showReflection:true
    		layoutX: (w - 200)/2 layoutY:video.fitHeight + 50
    	}
    


When we place the variable video and the instance of MediaController in a scene and execute the application, we get a screen as shown in the next screenshot:

How it works…


The custom class presented in this recipe implements a CustomNode that encapsulates the icons and logic for the media playback control functions, including reverse, play, fast-forward, volume up, and volume down. The class also provides visual feedback on the length and current progression of the video playback. Let's take a closer look at the custom class:

  • Textual time progression—before we look at the control functions, let us look at how the component reports the time progression of the playback. The first item involved is the variable timestat, to which a Text object that displays the progression information is bound. timestat is itself bound to several expressions that return the current time and total time of the playback, using values from mediaPlayer.currentTime and mediaPlayer.media.duration. Since time is reported as a Duration type, we have to pluck out each time subdivision (hours, minutes, seconds) individually using the mod operator. Each unit is then formatted as a zero-padded value, as shown in the snippet below:
  •  
    	var timestat = bind
    		"{%02d mediaPlayer.currentTime.toHours() mod 12 as Integer}:"
    		"{%02d mediaPlayer.currentTime.toMinutes() mod 60 as Integer}:"
    	...
    

  • Visual time progression—to provide visual feedback on the progression of the playback, the media controller uses a custom progress bar composed of a Circle that slides along a Line instance. The line represents the total duration of the video, and the location of the circle along the line represents the current position of the playhead.
    To achieve this, the Circle.centerX property is bound to an expression that returns the ratio mediaPlayer.currentTime / mediaPlayer.media.duration. Multiplying this ratio by the length of the line yields the current position of the circle, as shown in the snippet below:
  •  
    	Circle{
    		...
    		centerX:bind
    		if(mediaPlayer.media.duration > 0ms)
    			(mediaPlayer.currentTime /
    			mediaPlayer.media.duration)*100
    		else 5
    	}
    

  • The controls—as before, the control buttons consist of image icons displayed by instances of ImageView. The custom component loads six icons that represent the functions reverse, play, pause, fast-forward, volume up, and volume down.
    The play and pause icons, assigned to the ImageView instance with id="playCtrl", use the same logic as the previous media playback recipes (consult the recipe Playing audio with MediaPlayer for details). Let's see how the others work:


    • To reverse and fast-forward, we use the ImageView instances with id="reverseCtrl" and id="ffwdCtrl" respectively. When the user clicks on these icons, the code subtracts one percent of the total duration from (or adds it to) the mediaPlayer.currentTime property. This has the effect of moving the playhead in the desired direction.

    • Adjusting the volume is even simpler. We use the instances of ImageView with id="volup" and id="voldn" to control the volume. When the user clicks on an icon, it sets mediaPlayer.volume to the desired ratio: to increase the volume, we add 0.4 to the current level; to decrease it, we subtract 0.4.
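One refinement worth noting: mediaPlayer.volume is a ratio between 0.0 and 1.0, so repeated clicks can push the value out of range. A small clamping helper, not part of the book's listing and shown here only as a suggested sketch, keeps the value valid:

```
	// Suggested helper (not in the original listing): clamp the
	// volume into the valid 0.0 .. 1.0 range before assigning it.
	function adjustVolume(delta:Number):Void {
		var v = mediaPlayer.volume + delta;
		if (v < 0.0) { v = 0.0 } else if (v > 1.0) { v = 1.0 }
		mediaPlayer.volume = v;
	}
	// usage inside the icon handlers:
	//   onMouseClicked:function(e){ adjustVolume(0.4) }   // volume up
	//   onMouseClicked:function(e){ adjustVolume(-0.4) }  // volume down
```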
