At the server:
> mkfifo fifo.flv
> APPTOMAKEFLVFILE fifo.flv &
> netcat -l -p 1234 < fifo.flv
At the client:
> netcat SERVERADDRESS 1234 | ffplay pipe:
In this case, APPTOMAKEFLVFILE is compiled from the output_example.c file that comes with the ffmpeg source from http://ffmpeg.mplayerhq.hu/
Thursday, 31 July 2008
To pipe data into ffplay over a network
Server: netcat -l -p 1234 < test.mpg
Client: netcat SERVERADDRESS 1234 | ffplay pipe:
Tuesday, 29 July 2008
Compiling against libavcodec
When using the libraries supplied with ffmpeg (such as libavcodec, libavfilter, etc.) you must link against them in the following order:
-lavformat -lavcodec -lavdevice -lavutil -lm -lswscale -lavfilter
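For example, a link line might look like the following sketch (main.cpp stands in for your own source file):
> g++ main.cpp -o myapp -lavformat -lavcodec -lavdevice -lavutil -lm -lswscale -lavfilter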
If you see the error
undefined reference to `BZ2_bzDecompressInit'
then you must reconfigure and make ffmpeg, adding
--disable-bzlib
to the configuration line.
In this case, the configuration for ffmpeg was...
./configure --disable-vhook --enable-x11grab --enable-gpl --disable-bzlib
Also, remember to wrap the libavcodec header files in an extern "C" block if you are using C++, e.g.
extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
}
Friday, 25 July 2008
Blender real-time engine
Blender contains a real-time engine with a physics engine and definable game logic. Press 'p' to launch it.
stdout over IP
On Linux, use netcat (or nc) to send stdout over a network. The '-l' switch listens on a port for a connection. Use pipes to feed the received data into an application.
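For example (a sketch; myapp, otherapp and SERVERADDRESS are placeholders):
Receiver: netcat -l -p 1234 | otherapp
Sender: myapp | netcat SERVERADDRESS 1234
Start the receiver first so that the sender has something to connect to.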
Friday, 18 July 2008
mkfifo
The Linux command mkfifo creates a named pipe (FIFO): a special file that one process can write to while another process reads from it.
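For example (a sketch; the writer and reader can be any two applications):
> mkfifo mypipe
> APPTHATWRITES > mypipe &
> APPTHATREADS < mypipe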
Wednesday, 16 July 2008
Streaming raw image data into Flash
As an alternative to using flv video in Flash, it is possible to stream raw image data across a socket and use it to fill a BitmapData object, provided that each pixel is transferred as a 32-bit ARGB value.
Streaming from a socket served by an external application on localhost, this displays about 9 fps at 820x640 pixels. Of course, there is no audio.
package
{
    import flash.display.*;
    import flash.geom.Rectangle;
    import flash.net.*;
    import flash.events.*;
    import flash.utils.ByteArray;
    import flash.errors.EOFError;

    /**
     * Open a socket, read an image, display it, repeat.
     */
    public class Main extends Sprite
    {
        private var imageSocket:Socket;
        private var response:String;
        private var imageBytes:ByteArray;
        private var byteArrayOffset:Number;
        private var myBitmap:Bitmap;
        private var myBitmapData:BitmapData;

        public function Main()
        {
            response = "";
            imageBytes = new ByteArray();
            byteArrayOffset = 0;
            stage.stageWidth = 820;
            stage.stageHeight = 640;
            myBitmapData = new BitmapData(820, 640, true, 0xFFFFFF00);
            myBitmap = new Bitmap(myBitmapData);
            stage.addChild(myBitmap);
            imageSocket = new Socket("localhost", 4242);
            imageSocket.addEventListener(Event.CLOSE, closeHandler);
            imageSocket.addEventListener(Event.CONNECT, connectHandler);
            imageSocket.addEventListener(IOErrorEvent.IO_ERROR, ioErrorHandler);
            imageSocket.addEventListener(SecurityErrorEvent.SECURITY_ERROR, securityErrorHandler);
            imageSocket.addEventListener(ProgressEvent.SOCKET_DATA, socketDataHandler);
        }

        private function closeHandler(event:Event):void
        {
            imageSocket.flush();
            trace("closeHandler: " + event);
            trace(response.toString());
        }

        private function connectHandler(event:Event):void
        {
            trace("connectHandler: " + event);
        }

        private function ioErrorHandler(event:IOErrorEvent):void
        {
            trace("ioErrorHandler: " + event);
        }

        private function securityErrorHandler(event:SecurityErrorEvent):void
        {
            trace("securityErrorHandler: " + event);
        }

        private function socketDataHandler(event:ProgressEvent):void
        {
            //trace("socketDataHandler: " + event);
            //This reads the data on the socket into a ByteArray object.
            //Note that it arrives in chunks, so you need to add up the bytesLoaded
            //property each time the event is called until your message is the size
            //you expect.
            imageSocket.readBytes(imageBytes, byteArrayOffset, event.bytesLoaded);
            byteArrayOffset = byteArrayOffset + event.bytesLoaded;
            if (byteArrayOffset >= 820 * 640 * 4) //image is loaded
            {
                //do stuff with image
                byteArrayOffset = 0;
                //Need to reset the position pointer in the ByteArray so that
                //subsequent functions read from the start of the array.
                imageBytes.position = 0;
                this.drawImage();
            }
        }

        private function drawImage():void
        {
            try
            {
                var rect:Rectangle = new Rectangle(0, 0, 820, 640);
                //This sets the pixel values in the BitmapData object to the values in the ByteArray.
                myBitmapData.setPixels(rect, imageBytes);
            }
            catch (e:EOFError)
            {
                trace(e);
            }
        }
    }
}
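To test the client, a minimal sketch is to serve a single raw frame with netcat (frame.argb stands in for a file containing 820x640 32-bit ARGB pixels; note that recent versions of Flash Player may also demand a socket policy file before allowing the connection):
> netcat -l -p 4242 < frame.argb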
Monday, 7 July 2008
BitmapData.copyChannel
In the flash.display.BitmapData class the channels are numbered as follows:
1 (red)
2 (green)
4 (blue)
8 (alpha)
This is important to know when using the copyChannel method.
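These values are also available as the RED, GREEN, BLUE and ALPHA constants of the flash.display.BitmapDataChannel class. As a minimal sketch (the bitmaps and sizes here are placeholders), this copies the red channel of one bitmap into the blue channel of another:

import flash.display.BitmapData;
import flash.geom.Point;
import flash.geom.Rectangle;

var src:BitmapData = new BitmapData(100, 100);
var dest:BitmapData = new BitmapData(100, 100);
//sourceChannel = 1 (red), destChannel = 4 (blue)
dest.copyChannel(src, new Rectangle(0, 0, 100, 100), new Point(0, 0), 1, 4);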
Improving Z depth rendering in Away3D
Away3D seems to use a mean Z-position algorithm for depth sorting when rendering, i.e. it calculates the mean Z of the vertices in each triangle and uses that value to determine the triangle's Z depth. To improve the accuracy of the rendering, increase the number of triangles in the object (at the cost of render speed).
e.g. for a plane, use
myPlane.segmentsH = x;
myPlane.segmentsW = x;
where x is greater than 1 (the default).
Friday, 4 July 2008
Mapping flv video to geometry in ActionScript
To avoid having to use SWF-encapsulated video files to map video to geometry in Flash 9, use the flash.media.Video class instead. The following extends the Plane class in the open source Away3D Flash library to give a plane with a texture-mapped video (flv). The constructor takes the location of the flv file (either a path to the file or a URL).
This version of the class should now work with Away3D 3.0.0
package
{
    import away3d.core.math.Number3D;
    import away3d.primitives.Plane;
    import away3d.materials.BitmapMaterial;
    import away3d.materials.VideoMaterial;
    import flash.geom.Point;
    import flash.geom.Rectangle;
    import flash.display.*;
    import flash.events.*;
    import flash.net.*;
    import flash.media.Video;

    public class videoPlane extends Plane
    {
        private var video:DisplayObject;
        private var videoURL:String;
        private var videoBitmapData:BitmapData;
        private var videomaterial:BitmapMaterial;
        private var alphaMap:BitmapData;
        private var vidConnection:NetConnection;
        private var vidStream:NetStream;
        private var vid:Video;
        private var infoClient:Object;
        private var alphax:Number;
        private var alphay:Number;
        private var asset:String;
        private var alphaBool:Boolean;
        private var aspectRatio:Number;

        public function videoPlane(assetLocation:String)
        {
            asset = assetLocation;
            this.ownCanvas = true;
            alphaBool = true;
            trace("videoPlane()");
            this.segmentsH = 8; //increases the number of triangles in the plane, and hence improves the
            this.segmentsW = 8; //accuracy of the mean-Z algorithm used to determine Z-depth when rendering
            vidConnection = new NetConnection();
            vidConnection.addEventListener(NetStatusEvent.NET_STATUS, NetStatusHandler);
            vidConnection.addEventListener(SecurityErrorEvent.SECURITY_ERROR, SecurityErrorHandler);
            vidConnection.connect(null);
            this.bothsides = true;
            aspectRatio = 1024 / 576;
            this.height = 50;
            this.width = this.height * aspectRatio;
        }

        private function NetStatusHandler(event:NetStatusEvent):void
        {
            switch (event.info.code)
            {
                case "NetConnection.Connect.Success":
                    vidStream = new NetStream(vidConnection);
                    infoClient = new Object();
                    vidStream.client = infoClient;
                    vid = new Video();
                    vid.attachNetStream(vidStream);
                    vidStream.play(asset);
                    this.videoBitmapData = new BitmapData(vid.width, vid.height, true, 0xFF00ce);
                    videomaterial = new BitmapMaterial(this.videoBitmapData);
                    videomaterial.precision = 5;
                    this.material = videomaterial;
                    this.bothsides = true;
                    this.videoBitmapData.draw(vid);
                    alphaMap = new BitmapData(vid.width, vid.height, true, 0x7F000000);
                    vidStream.addEventListener(NetStatusEvent.NET_STATUS, vidStreamCompleteHandler);
                    break;
                case "NetStream.Play.StreamNotFound":
                    dispatchEvent(new StatusEvent(StatusEvent.STATUS, true, false, event.info.code, "Status"));
                    break;
            }
        }

        private function SecurityErrorHandler(event:SecurityErrorEvent):void
        {
            dispatchEvent(new StatusEvent(StatusEvent.STATUS, true, false, event.type, "Status"));
        }

        private function vidStreamCompleteHandler(ns:NetStatusEvent):void
        {
            //loop the video when it reaches the end
            if (ns.info.code == "NetStream.Play.Stop")
            {
                vidStream.seek(0);
                vidStream.resume();
            }
        }

        public function updateVideo():void
        {
            //redraw the current video frame into the texture bitmap
            if (vid != null)
            {
                videomaterial = new BitmapMaterial(this.videoBitmapData);
                this.material = videomaterial;
                this.videoBitmapData.draw(vid);
            }
        }

        public function pausePlayback():void
        {
            vidStream.pause();
        }

        public function resumePlayback():void
        {
            vidStream.resume();
        }
    }
}
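A usage sketch (scene, view and the render loop are assumed to come from your own Away3D setup; myVideo.flv is a placeholder):

var plane:videoPlane = new videoPlane("myVideo.flv");
scene.addChild(plane);

//then, once per frame:
plane.updateVideo(); //redraw the current video frame into the texture
view.render();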
Tuesday, 1 July 2008
Correcting texture distortion in Away3D
Use the precision attribute of the material.
e.g.
var videomaterial: BitmapMaterial = new BitmapMaterial(this.myBitmapData);
videomaterial.precision = 2;
This will slow rendering down, but remove perspective distortion from the textures. Increasing the value improves render speed at the price of quality.
Javascript
The <script> tag is used to bracket JavaScript code within an HTML document, where the code can manipulate the page through the Document Object Model (DOM).