When using ARToolKit to open a video stream, one of the following functions is called:
int arVideoOpen(char *vconf);
or
AR2VideoParamT *ar2VideoOpen(char *vconf);
Both functions take a single parameter, vconf, which is a character string.
This character string encapsulates operating-system dependent directions as
to how to find, open and configure the video stream.
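As a quick orientation, here is a minimal sketch of the usual open / capture / close sequence built around arVideoOpen(). The companion calls arVideoCapStart(), arVideoGetImage(), arVideoCapStop() and arVideoClose() are part of the standard ARToolKit video API; passing NULL simply accepts the driver's defaults, and would normally be replaced by one of the driver-specific strings described below:

#include <AR/video.h>

int main(void)
{
    char *vconf = NULL;                    /* NULL: accept the driver defaults */

    if (arVideoOpen(vconf) < 0) return -1; /* open the stream described by vconf */
    arVideoCapStart();                     /* start capturing */

    ARUint8 *image = arVideoGetImage();    /* grab one frame (NULL if none is ready yet) */
    /* ... process image ... */

    arVideoCapStop();                      /* stop capturing */
    arVideoClose();                        /* release the device */
    return 0;
}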
In order to assist you in determining a suitable set of parameters, the functions
arVideoDispOption();
or
ar2VideoDispOption();
print help text describing the options understood by the video driver your copy of the library was built with.
The text is reproduced here for your convenience.
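For example, a tiny program that does nothing but print this help text for the driver compiled into your library might look like this sketch:

#include <AR/video.h>

int main(void)
{
    /* Print the vconf options understood by the video driver this library was built with. */
    arVideoDispOption();
    return 0;
}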
The help text for each operating system / driver is reproduced below.
ARVideo may be configured using one or more of the following options,
separated by a space:
-node=N
specifies detected node ID of a FireWire camera (-1: Any).
-card=N
specifies the FireWire adaptor id number (-1: Any).
-mode=[320x240_YUV422|640x480_RGB|640x480_YUV411]
specifies input image format.
-rate=N
specifies desired framerate of a FireWire camera.
(1.875, 3.75, 7.5, 15, 30, 60)
-[name]=N where name is brightness, iris, shutter, gain, saturation, gamma, or sharpness.
(The value must be a legal value for this parameter; use coriander to find out what the legal values are.)
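Putting a few of these options together, an illustrative call for a FireWire camera could look like the line below; the values are examples only, and options you omit fall back to the driver defaults:

arVideoOpen("-node=-1 -card=-1 -mode=640x480_YUV411 -rate=30");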
ARVideo via GStreamer is configured with a configuration string following the gst-launch syntax. Some examples, either set via the ARTOOLKIT_CONFIG environment variable (bash syntax) or passed directly to arVideoOpen():
$> export ARTOOLKIT_CONFIG="filesrc location=gstreamer_test_xvid.avi ! decodebin ! ffmpegcolorspace ! capsfilter caps=video/x-raw-rgb,bpp=24 ! identity name=artoolkit ! fakesink"
$> export ARTOOLKIT_CONFIG="v4l2src device=/dev/video0 use-fixed-fps=false ! ffmpegcolorspace ! capsfilter caps=video/x-raw-rgb,bpp=24 ! identity name=artoolkit ! fakesink"
arVideoOpen("videotestsrc ! capsfilter caps=video/x-raw-rgb,bpp=24 ! identity name=artoolkit ! fakesink");
$> export ARTOOLKIT_CONFIG="rtspsrc location=rtsp://somertspstreamingserver:554/live.sdp ! rtpmp4vdepay ! decodebin ! ffmpegcolorspace ! capsfilter caps=video/x-raw-rgb,bpp=24 ! identity name=artoolkit ! fakesink sync=0"
ARVideo may be configured using one or more of the following options,
separated by a space:
-mode=[PAL|NTSC]
specifies TV signal mode.
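For example, an illustrative call requesting a PAL signal (use NTSC if that matches your camera and region):

arVideoOpen("-mode=PAL");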
ARVideo may be configured using one or more of the following options,
separated by a space:
DEVICE CONTROLS:
-dev=filepath
specifies device file.
-channel=N
specifies source channel.
-noadjust
prevent adjusting the width/height/channel if not suitable.
-width=N
specifies expected width of image.
-height=N
specifies expected height of image.
-palette=[RGB|YUV420P]
specifies the camera palette (WARNING: not every palette is supported by every camera!).
IMAGE CONTROLS (WARNING: not every option is supported by all cameras!):
-brightness=N
specifies brightness. (0.0 <-> 1.0)
-contrast=N
specifies contrast. (0.0 <-> 1.0)
-saturation=N
specifies saturation (color). (0.0 <-> 1.0) (for color camera only)
-hue=N
specifies hue. (0.0 <-> 1.0) (for color camera only)
-whiteness=N
specifies whiteness. (0.0 <-> 1.0) (REMARK: acts as gamma for some drivers; otherwise for greyscale cameras only)
-color=N
specifies saturation (color). (0.0 <-> 1.0) (REMARK: obsolete! use the saturation control instead)
OPTION CONTROLS:
-mode=[PAL|NTSC|SECAM]
specifies TV signal mode (for tv/capture card).
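As an illustration, a Video4Linux configuration selecting the first device at 640x480 with the YUV420P palette could be the call below; the values are examples only, and, as the warnings above note, not every camera supports every size or palette:

arVideoOpen("-dev=/dev/video0 -width=640 -height=480 -palette=YUV420P");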
ARVideo may be configured using one or more of the following options,
separated by a space:
printf(" -pixelformat=cccc\n");
printf(" Return images with pixels in format cccc, where cccc is either a\n");
printf(" numeric pixel format number or a valid 4-character-code for a\n");
printf(" pixel format.\n");
printf(" The following numeric values are supported: \n");
printf(" 24 (24-bit RGB), 32 (32-bit ARGB), 40 (8-bit grey)");
printf(" The following 4-character-codes are supported: \n");
printf(" BGRA, RGBA, ABGR, 24BG, 2vuy, yuvs.\n");
printf(" (See http://developer.apple.com/quicktime/icefloe/dispatch020.html.)\n");
printf(" -fliph\n");
printf(" Flip camera image horizontally.\n");
printf(" -flipv\n");
printf(" Flip camera image vertically.\n");
printf(" -singlebuffer\n");
printf(" Use single buffering of captured video instead of triple-buffering.\n");
-nodialog
Don't display video settings dialog.
-width=w
Scale camera native image to width w.
-height=h
Scale camera native image to height h.
-fps
Overlay camera frame counter on image.
-grabber=n
With multiple QuickTime video grabber components installed,
use component n (default n=1).
N.B. It is NOT necessary to use this option if you have installed
more than one video input device (e.g. two cameras) as the default
QuickTime grabber can manage multiple video channels.
-pixelformat=cccc
The following numeric values are supported:
24 (24-bit RGB), 32 (32-bit ARGB), 40 (8-bit grey)
The following 4-character-codes are supported:
BGRA, RGBA, ABGR, 24BG, 2vuy, yuvs.
(See http://developer.apple.com/quicktime/icefloe/dispatch020.html.)
-fliph
Flip camera image horizontally. (Added in ARToolKit v2.72.)
-flipv
Flip camera image vertically. (Added in ARToolKit v2.72.)
-singlebuffer
Use single buffering of captured video instead of triple-buffering.
(Added in ARToolKit v2.73.)
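For instance, an illustrative call that skips the settings dialog and requests a 640x480 image in 32-bit ARGB, flipped horizontally (example values only; the dialog is often useful the first time you run):

arVideoOpen("-nodialog -width=640 -height=480 -pixelformat=32 -fliph");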
ARVideo may be configured using one or more of the following options,
separated by a space:
-size=[FULL/HALF]
specifies size of image.
-device=N
specifies device number.
-bufsize=N
specifies video buffer size.
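For example, an illustrative call requesting a full-size image from device 0 (assuming devices are numbered from 0):

arVideoOpen("-size=FULL -device=0");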
In ARToolKit 2.71 and later, the video configuration is specified in an XML file, conforming to the DSVideoLib XML Schema. This schema is documented in the file "DsVideoLib.xsd" inside the DSVL-0.0.8b package. You can use an XML Schema viewer to view the schema file, and a text editor or XML editor to edit your own configuration file.
The pathname of the configuration file is then passed as the video configuration string, i.e. the
parameter is a file name (e.g. 'config.XML') conforming to the DSVideoLib XML Schema (DsVideoLib.xsd).
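For instance, if you have saved your configuration as WDM_camera.xml (a hypothetical file name) next to the executable, the open call is simply:

arVideoOpen("WDM_camera.xml");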
In versions prior to ARToolKit 2.71, the parameter format is either NULL or a list of tokens, separated by commas (",").
BINARY TOKENS:
--------------
flipH : flip image horizontally (WARNING: non-optimal performance)
flipV : flip image vertically (WARNING: non-optimal performance)
showDlg : displays either WDM capture filter's property page or
MSDV decoder format dialog (depending on source media type).
only applies to WDM_CAP, will be ignored for AVI_FILE
loopAVI : continuously loops through an AVI file (applies only to AVI_FILE)
noClock : does not use a Reference Clock to synchronize samples;
use this flag for offline post-processing (applies only to AVI_FILE)
renderSec : render secondary streams (applies only to AVI_FILE)
An AVI file can contain an arbitrary number of multiplexed A/V substreams;
usually there are at most 2 substreams (1st: video, 2nd: audio).
The AVI_FILE input module will only try to render stream 0x00 (assuming that
it's video) and ignore the remaining substreams.
Use this flag to force IGraphBuilder->Render(IPin*) calls on substreams 1..n.
DO NOT SET this flag if your AVI file contains more than one video stream.
PARAMETRIZED TOKENS:
--------------------
inputDevice=? : supported parameters:
"WDM_CAP" (WDM_VIDEO_CAPTURE_FILTER) use the DirectShow WDM wrapper
to obtain live video input from a streaming capture device
such as a IEEE 1394 DV camera or USB webcam.
OPTIONAL: set deviceName=? and/or ieee1394id=? for better
control over the choice of suitable WDM drivers
"AVI_FILE" (ASYNC_FILE_INPUT_FILTER) use an Asynchronous File Input
Filter to read & decode AVI video data
NOTE: be sure to specify the input file name by pointing
fileName=? to a valid AVI file.
EXAMPLE: "inputDevive=WDM_CAP", or "inputDevice=AVI_FILE"
DEFAULT: "WDM_CAP" will be selected if you omit this token
videoWidth=? : preferred video width, EXAMPLE: "videoWidth=720"
only applies to WDM_CAP, will be ignored for AVI_FILE
videoHeight=? : preferred video height, EXAMPLE: "videoHeight=576"
only applies to WDM_CAP, will be ignored for AVI_FILE
pixelFormat=? : internal pixel format (see PixelFormat.h for supported types)
PERFORMANCE WARNING: Be sure to match your IDirect3DTexture/OpenGL texture
formats to whatever you specify here, i.e. use
PXtoOpenGL(format), PXtoD3D(format) for creating your
textures! (see PixelFormat.h for details)
EXAMPLE: "pixelFormat=PIXELFORMAT_RGB32"
NOTE: if you omit the pixelFormat=? token, the global
constant (default_PIXELFORMAT, usually PIXELFORMAT_RGB32)
will be selected.
friendlyName=? : only applies to WDM_CAP, will be IGNORED if "inputDevice=WDM_CAP" is not set.
Used to select a preferred WDM device. WILL BE IGNORED IF deviceName=? IS SET.
Matching is done by trying to match the substring <?> against the "friendly names" of
enumerated DirectShow WDM wrappers (ksproxy.ax).
(WARNING: WDM "friendly names" are locale-dependent.)
EXAMPLE: "friendlyName=Microsoft DV Camera" for IEEE1394 DV devices
"friendlyName=QuickCam" for Logitech QuickCam
deviceName=? : only applies to WDM_CAP, will be IGNORED if "inputDevice=WDM_CAP" is not set.
Used to select a preferred WDM device. WILL ALWAYS OVERRIDE friendlyName=?.
Matching is done by trying to match the substring <?> against the "device names" of
enumerated DirectShow WDM wrappers (ksproxy.ax).
Device names look like: "@device:*:{860BB310-5D01-11d0-BD3B-00A0C911CE86}
Use GraphEdit (part of the DirectX SDK, under \DXSDK\bin\DxUtils\graphedt.exe)
to figure out your camera's device name.
EXAMPLE: "deviceName=1394#unibrain&fire-i_1.2#4510000061431408
fileName=? : only applies to AVI_FILE, will be IGNORED if "inputDevice=AVI_FILE" is not set.
Input file name. If you specify just a file's name (without its full path), the WIN32
API function SearchPath() (Winbase.h) will be used to locate the file.
EXAMPLE: "fileName=C:\Some Directory\Another Directory\Video.AVI"
"fileName=video.AVI" (will succeed if C:\Some Directory\Another Directory\
is: * the application's startup directory
* the current directory
* listed in the PATH environment variable)
ieee1394id=? : only applies to WDM_CAP, will be IGNORED if "inputDevice=WDM_CAP" is not set.
Unique 64-bit device identifier, as defined by IEEE 1394.
Hexadecimal value expected, i.e. "ieee1394id=437d3b0201460008"
Use /bin/IEEE394_id.exe to determine your camera's ID.
deinterlaceState=? : supported parameters (see VFX_STATE_names[])
"off" : disable deinterlacing (DEFAULT)
"on" : force deinterlacing (even for progressive frames)
"auto" : enable deinterlacing only if
(VIDEOINFOHEADER.dwInterlaceFlags & AMINTERLACE_IsInterlaced)
WARNING: EXPERIMENTAL FEATURE!
deinterlaceMethod=? : deinterlacing method (see VFxDeinterlaceParam.h for supported modes)
supported parameters (see DEINTERLACE_METHOD_names[]):
"blend" : blend fields (best quality)
"duplicate1" : duplicate first field
"duplicate2" : duplicate second field
NOTE: omitting this token results in default mode (DEINTERLACE_BLEND) being used.
WARNING: EXPERIMENTAL FEATURE!
EXAMPLES:
arVideoOpen(NULL);
arVideoOpen("inputDevice=WDM_CAP,showDlg");
arVideoOpen("inputDevice=WDM_CAP,flipH,flipV,showDlg");
arVideoOpen("inputDevice=WDM_CAP,pixelFormat=PIXELFORMAT_RGB24,showDlg");
arVideoOpen("inputDevice=WDM_CAP,showDlg,deinterlaceState=on,deinterlaceMethod=duplicate1");
arVideoOpen("inputDevice=WDM_CAP,videoWidth=640,flipH,videoHeight=480,showDlg,deinterlaceState=auto");
arVideoOpen("inputDevice=WDM_CAP,friendlyName=Microsoft DV Camera,videoWidth=720,videoHeight=480");
arVideoOpen("inputDevice=WDM_CAP,friendlyName=Logitech,videoWidth=320,videoHeight=240,flipV");
arVideoOpen("inputDevice=WDM_CAP,friendlyName=Microsoft DV Camera,ieee1394id=437d3b0201460008");
arVideoOpen("inputDevice=AVI_FILE,fileName=C:\Some Directory\Another Directory\Video.AVI");
arVideoOpen("inputDevice=AVI_FILE,fileName=Video.AVI,pixelFormat=PIXELFORMAT_RGB24");