Continuing from the previous post, <Basler 3.2 Pylon C API Notes 1>.

Grabbing Using Stream Grabber Objects (this happens on the host PC, not in the camera)

Using stream grabber objects to grab images:

1. Get the number of streams (nStreams) provided by the device:   PylonDeviceGetNumStreamGrabberChannels( hDev, &nStreams );

2. Get a handle for the device's first stream:  PylonDeviceGetStreamGrabber( hDev, 0, &hGrabber );

3. Open the stream grabber, much like opening a file with fopen():     PylonStreamGrabberOpen( hGrabber )

4. Get the wait object associated with the stream:       PylonStreamGrabberGetWaitObject

Grabbing Using Stream Grabber Objects

The following sections describe the use of stream grabber objects. The order of the sections reflects the sequence in which a typical grab application will use a stream grabber object.

Getting a Stream Grabber

Stream grabber objects are managed by camera objects. The number of stream grabbers provided by a camera can be determined using the PylonDeviceGetNumStreamGrabberChannels() function. The PylonDeviceGetStreamGrabber() function returns a PYLON_STREAMGRABBER_HANDLE. Prior to retrieving a stream grabber handle, the camera device must have been opened. Note that the value returned by PylonDeviceGetNumStreamGrabberChannels() may be 0, as some camera devices, e.g. Camera Link cameras, have no stream grabber. These cameras can still be parameterized as described, but grabbing is not supported for them. Before use, stream grabbers must be opened by a call to PylonStreamGrabberOpen(). When image acquisition is finished, the stream grabber must be closed by a call to PylonStreamGrabberClose().

A stream grabber also provides a wait object for the application to be notified whenever a buffer containing new image data becomes available.

Example:

/* Image grabbing is done using a stream grabber.  
  A device may be able to provide different streams. A separate stream grabber must 
  be used for each stream. In this sample, we create a stream grabber for the default 
  stream, i.e., the first stream ( index == 0 ).
  */

/* Get the number of streams supported by the device and the transport layer. */
res = PylonDeviceGetNumStreamGrabberChannels( hDev, &nStreams );
CHECK(res);
if ( nStreams < 1 )
{
    fprintf( stderr, "The transport layer doesn't support image streams\n");
    PylonTerminate();
    pressEnterToExit();
    exit(EXIT_FAILURE);
}

/* Create and open a stream grabber for the first channel. */ 
res = PylonDeviceGetStreamGrabber( hDev, 0, &hGrabber );
CHECK(res);
res = PylonStreamGrabberOpen( hGrabber );
CHECK(res);

/* Get a handle for the stream grabber's wait object. The wait object
   allows waiting for buffers to be filled with grabbed data. */
res = PylonStreamGrabberGetWaitObject( hGrabber, &hWait );
CHECK(res);
Note:
The lifetime of a stream grabber is managed by the camera owning it. There is no need (and no facility) to dispose of a PYLON_STREAMGRABBER_HANDLE. This also means that if the camera object owning the stream grabber is deleted by calling PylonDestroyDevice() on it, the related stream grabber handle will become invalid.

----------------------------------------------------------------------------------------------------------------------------------------------------

Configuring a Stream Grabber

Configuring the stream grabber parameters on the host PC:

1. MaxBufferSize:  the size in bytes of each image buffer

2. MaxNumBuffer:  the maximum number of image buffers

3. Set the number of buffers:  PylonStreamGrabberSetMaxNumBuffer( hGrabber, NUM_BUFFERS )

4. Set the size of each buffer:  PylonStreamGrabberSetMaxBufferSize( hGrabber, payloadSize )

5. Allocate the resources required for grabbing:  PylonStreamGrabberPrepareGrab( hGrabber )

6. Register every buffer before using it:  PylonStreamGrabberRegisterBuffer( hGrabber, buffers[i], payloadSize, &bufHandles[i] )

7. Deregister buffers before freeing their memory:  PylonStreamGrabberDeregisterBuffer( hGrabber, bufHandles[i] );

 

Independent of the physical camera interface used, every stream grabber provides two mandatory parameters:

  • MaxBufferSize - Maximum size in bytes of a buffer used for grabbing images
  • MaxNumBuffer - Maximum number of buffers used for grabbing images

A grab application must set the above two parameters before grabbing begins. pylon C provides a set of convenience functions for easily accessing these parameters: PylonStreamGrabberSetMaxNumBuffer(), PylonStreamGrabberGetMaxNumBuffer(), PylonStreamGrabberSetMaxBufferSize() and PylonStreamGrabberGetMaxBufferSize().
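For verification, the current values can be read back with the corresponding Get functions. The following is a minimal sketch, not taken from the SDK samples; it assumes the Get functions mirror their Set counterparts and report the values through size_t output parameters:

/* Sketch: read back the buffer configuration of the stream grabber.
   Assumption: the Get functions return their results via size_t output parameters. */
size_t maxNumBuffer  = 0;
size_t maxBufferSize = 0;

res = PylonStreamGrabberGetMaxNumBuffer( hGrabber, &maxNumBuffer );
CHECK(res);
res = PylonStreamGrabberGetMaxBufferSize( hGrabber, &maxBufferSize );
CHECK(res);

printf( "Stream grabber configured for %u buffers of %u bytes each.\n",
        (unsigned int) maxNumBuffer, (unsigned int) maxBufferSize );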

Depending on the transport technology, a stream grabber can provide further parameters such as streaming-related timeouts. All these parameters are initially set to reasonable default values, so that grabbing works without having to adjust them. An application can gain access to these parameters using the method described in Generic Parameter Access.
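As a sketch of what such generic access could look like, the snippet below reads the mandatory MaxBufferSize parameter through the grabber's node map. The node map retrieval via PylonStreamGrabberGetNodeMap() is an assumption here; see the Generic Parameter Access section for details on node maps and node handles:

/* Sketch of generic parameter access on a stream grabber (assumption: the grabber's
   parameters are exposed through a node map obtained with PylonStreamGrabberGetNodeMap()). */
NODEMAP_HANDLE hGrabberNodeMap;
NODE_HANDLE    hMaxBufferSizeNode;
int64_t        maxBufferSize;

res = PylonStreamGrabberGetNodeMap( hGrabber, &hGrabberNodeMap );
CHECK(res);

/* Look up the parameter node by its name. */
res = GenApiNodeMapGetNode( hGrabberNodeMap, "MaxBufferSize", &hMaxBufferSizeNode );
CHECK(res);
if ( GENAPIC_INVALID_HANDLE != hMaxBufferSizeNode )
{
    /* Read and print the current value. */
    res = GenApiIntegerGetValue( hMaxBufferSizeNode, &maxBufferSize );
    CHECK(res);
    printf( "MaxBufferSize = %d\n", (int) maxBufferSize );
}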

Preparing a Stream Grabber for Grabbing

Depending on the transport layer used for grabbing images, a number of system resources may be required, for example:

  • DMA resources
  • Memory for the driver's data structures
  • Isochronous channel for Firewire cameras
  • Isochronous bandwidth for Firewire cameras

A call to PylonStreamGrabberPrepareGrab() allocates all required resources and causes the camera object to change its state. For a typical camera, any parameters affecting resource requirements (AOI, pixel format, binning, etc.) will be read-only after the call to PylonStreamGrabberPrepareGrab(). These parameters must be set up beforehand and cannot be changed while the camera object is in this state.
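A minimal sketch of this ordering, assuming the camera exposes the standard Width, Height, and PixelFormat features (the concrete values are only examples):

/* Sketch: set the parameters that affect the payload size BEFORE calling
   PylonStreamGrabberPrepareGrab(). Assumptions: the camera provides the standard
   Width, Height, and PixelFormat features; payloadSize is an int32_t. */
if ( PylonDeviceFeatureIsWritable( hDev, "Width" ) )
{
    res = PylonDeviceSetIntegerFeature( hDev, "Width", 640 );
    CHECK(res);
}
if ( PylonDeviceFeatureIsWritable( hDev, "Height" ) )
{
    res = PylonDeviceSetIntegerFeature( hDev, "Height", 480 );
    CHECK(res);
}
if ( PylonDeviceFeatureIsAvailable( hDev, "EnumEntry_PixelFormat_Mono8" ) )
{
    res = PylonDeviceFeatureFromString( hDev, "PixelFormat", "Mono8" );
    CHECK(res);
}

/* The resulting payload size determines how large the grab buffers must be. */
res = PylonDeviceGetIntegerFeatureInt32( hDev, "PayloadSize", &payloadSize );
CHECK(res);

/* Only now allocate the grabbing resources; AOI and pixel format become read-only. */
res = PylonStreamGrabberPrepareGrab( hGrabber );
CHECK(res);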

Providing Memory for Grabbing

All pylon C transport layers utilize user-provided buffer memory for grabbing image and chunk data. An application is required to register the data buffers it intends to use with the stream grabber by calling PylonStreamGrabberRegisterBuffer() for each data buffer. This is necessary for performance reasons, allowing the stream grabber to prepare and cache internal data structures used to deal with user-provided memory. The call to PylonStreamGrabberRegisterBuffer() returns a handle for the buffer, which is used during later steps.

Example:

/* Allocate memory for grabbing.  */
for ( i = 0; i < NUM_BUFFERS; ++i )          /* NUM_BUFFERS = number of image buffers */
{
    buffers[i] = (unsigned char*) malloc ( payloadSize );
    if ( NULL == buffers[i] )
    {
        fprintf( stderr, "Out of memory!\n" );
        PylonTerminate();
        pressEnterToExit();
        exit(EXIT_FAILURE);
    }
}

/* We must tell the stream grabber the number and size of the buffers 
    we are using. */
/* .. We will not use more than NUM_BUFFERS for grabbing. */
res = PylonStreamGrabberSetMaxNumBuffer( hGrabber, NUM_BUFFERS );
CHECK(res);
/* .. We will not use buffers bigger than payloadSize bytes. */
res = PylonStreamGrabberSetMaxBufferSize( hGrabber, payloadSize );
CHECK(res);


/*  Allocate the resources required for grabbing. After this, critical parameters 
    that impact the payload size must not be changed until FinishGrab() is called. */
res = PylonStreamGrabberPrepareGrab( hGrabber );
CHECK(res);


/* Before using the buffers for grabbing, they must be registered at
   the stream grabber. For each registered buffer, a buffer handle
   is returned. After registering, these handles are used instead of the
   raw pointers. */
for ( i = 0; i < NUM_BUFFERS; ++i )
{
    res = PylonStreamGrabberRegisterBuffer( hGrabber, buffers[i], payloadSize,  &bufHandles[i] );
    CHECK(res);
}

The buffer registration mechanism transfers ownership of the buffers to the stream grabber. An application must never deallocate the memory belonging to buffers that are still registered. Freeing the memory is not allowed unless the buffers are deregistered by calling PylonStreamGrabberDeregisterBuffer() first.

for ( i = 0; i < NUM_BUFFERS; ++i )   
{
    res = PylonStreamGrabberDeregisterBuffer( hGrabber, bufHandles[i] );
    CHECK(res);
    free( buffers[i] );
}

 

----------------------------------------------------------------------------------------------------------------------------------------------------

Feeding the Stream Grabber's Input Queue

  image buffers -----> grabber's input queue ---->||----> [ grabbing ] ----> grabber's output queue ----||---> application wait object

1. PylonStreamGrabberQueueBuffer( hGrabber, bufHandles[i], (void*) i )

Every stream grabber maintains two different buffer queues, an input queue and an output queue. The buffers to be used for grabbing must be fed to the grabber's input queue. After grabbing, buffers containing image data can be retrieved from the grabber's output queue.

The PylonStreamGrabberQueueBuffer() function is used to append a buffer to the end of the grabber's input queue. It takes two parameters, a buffer handle and an optional pointer to application-specific context information. Along with the data buffer, the context pointer is passed back to the user when retrieving the buffer from the grabber's output queue. The stream grabber does not access the memory to which the context pointer points in any way.

Example:

/* Feed the buffers into the stream grabber's input queue. For each buffer, the API 
   allows passing in a pointer to additional context information. This pointer
   will be returned unchanged when the grab is finished. In our example, we use the index of the 
   buffer as context information. */
for ( i = 0; i < NUM_BUFFERS; ++i )
{
    res = PylonStreamGrabberQueueBuffer( hGrabber, bufHandles[i], (void*) i );
    CHECK(res);
}
Note:
Queuing buffers to a stream grabber's input queue does not start image acquisition. For this to happen, the camera must be programmed as described in Starting and Stopping Image Acquisition.

After buffers have been queued, the stream grabber is ready to grab image data into them, but acquisition must be started explicitly.

 

----------------------------------------------------------------------------------------------------------------------------------------------------

Starting and Stopping Image Acquisition

Image acquisition: the process inside the camera that produces a single image.

Image data transfer: moving the image data from the camera to the host PC.

Image grabbing: writing the image data into the host PC's memory.

1. Use the AcquisitionStart command parameter, executed via PylonDeviceExecuteCommandFeature(); in other words, the host sends an AcquisitionStart command to the camera, asking it to start delivering image data.

2. Single frame mode (acquire one image, then stop)

    Continuous mode (acquire images until acquisition is stopped explicitly)

3. More precisely, the AcquisitionStart command only prepares the camera: if external triggering or software triggering is enabled, acquisition does not start until a trigger signal or software trigger command arrives.

4. In continuous acquisition mode, the camera keeps sending data. To make it stop, issue the AcquisitionStop command, again via PylonDeviceExecuteCommandFeature().

To start image acquisition, use the camera's AcquisitionStart parameter. AcquisitionStart is a command parameter, which means that calling PylonDeviceExecuteCommandFeature() for the AcquisitionStart parameter sends an 'acquisition start' command to the camera.

A camera device typically provides two acquisition modes:

  • Single Frame mode where the camera acquires one image and then stops.
  • Continuous mode where the camera continuously acquires and transfers images until acquisition is stopped explicitly.

To be precise, the acquisition start command does not necessarily start acquisition in the camera immediately. If either external triggering or software triggering is enabled, the acquisition start command prepares the camera for image acquisition. Actual acquisition starts when the camera senses an external trigger signal or receives a software trigger command.
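As an illustration, software triggering could be set up roughly as follows. This is a sketch, not part of the original sample, and it assumes the camera exposes the standard TriggerSelector, TriggerMode, and TriggerSoftware features:

/* Sketch: enable software triggering for the frame start trigger (assumption: the
   camera provides the standard TriggerSelector, TriggerMode, and TriggerSoftware features). */
res = PylonDeviceFeatureFromString( hDev, "TriggerSelector", "FrameStart" );
CHECK(res);
res = PylonDeviceFeatureFromString( hDev, "TriggerMode", "On" );
CHECK(res);

/* Arm the camera. Acquisition does not start yet; the camera waits for a trigger. */
res = PylonDeviceExecuteCommandFeature( hDev, "AcquisitionStart" );
CHECK(res);

/* Each software trigger command now triggers the acquisition of one image. */
res = PylonDeviceExecuteCommandFeature( hDev, "TriggerSoftware" );
CHECK(res);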

When the camera's continuous acquisition mode is enabled, the AcquisitionStop parameter must be used to stop image acquisition.

Normally, a camera starts to transfer image data as soon as possible after acquisition. There is no specific command to start the image transfer.

Example:

/* Let the camera acquire images. */
res = PylonDeviceExecuteCommandFeature( hDev, "AcquisitionStart");
CHECK(res);

----------------------------------------------------------------------------------------------------------------------------------------------------

Retrieving Grabbed Images

1. Retrieving grabbed images: the application waits on the wait object until grabbed image data arrives in the output queue (on the host PC) or a timeout expires.

2. PylonStreamGrabberRetrieveResult() returns the information listed below (in a PylonGrabResult_t struct) and removes the buffer from the output queue. In other words, ownership of the buffer passes back from the stream grabber to the application.

  •    Grab status (succeeded, canceled, failed)
  •    The buffer handle
  •    The pointer to the buffer
  •    The user-provided context pointer (the buffer index in this example)
  •    The AOI and image format
  •    The error code and error description

3. PylonStreamGrabberCancelGrab() cancels the grab on the host side: the remaining buffers, whether still empty or currently being filled, are moved from the input queue to the output queue.

Image data is written to the buffer(s) in the stream grabber's input queue. When a buffer is filled with data, the stream grabber places it on its output queue, from which it can then be retrieved by the user application.

There is a wait object associated with every stream grabber's output queue. This wait object allows the application to wait until either a grabbed image arrives at the output queue or a timeout expires.

When the wait operation returns successfully, the grabbed buffer can be retrieved using the PylonStreamGrabberRetrieveResult() function. It uses a PylonGrabResult_t struct to return information about the grab operation:

  • Status of the grab (succeeded, canceled, failed)
  • The buffer's handle
  • The pointer to the buffer
  • The user-provided context pointer
  • AOI and image format
  • Error number and error description if the grab failed

This also removes the buffer from the output queue. Ownership of the buffer is returned to the application. A buffer retrieved from the output queue will not be overwritten with new image data until it is placed on the grabber's input queue again.

Remember, a buffer retrieved from the output queue must be deregistered before its memory can be freed.

Use the buffer handle from the PylonGrabResult_t struct to requeue a buffer to the grabber's input queue.

When the camera ceases to send data, all not yet processed buffers remain in the input queue until the PylonStreamGrabberCancelGrab() function is called. PylonStreamGrabberCancelGrab() puts all buffers from the input queue to the output queue, including any buffer currently being filled. Checking the status of the PylonGrabResult_t struct returned by PylonStreamGrabberRetrieveResult() allows you to determine whether a buffer has been canceled.
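For example, a drain loop that distinguishes canceled buffers could look like the following sketch. It assumes the grab status value for canceled buffers is named Canceled, analogous to the Grabbed and Failed values used in the grab loop below:

/* Sketch: after PylonStreamGrabberCancelGrab(), retrieve all buffers and report the
   ones that were canceled before being filled (assumption: the status value is Canceled). */
do
{
    res = PylonStreamGrabberRetrieveResult( hGrabber, &grabResult, &isReady );
    CHECK(res);
    if ( isReady && grabResult.Status == Canceled )
    {
        printf( "Buffer %d was canceled before it was filled.\n", (int) grabResult.Context );
    }
} while ( isReady );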

The following example shows a typical grab loop:

/* Grab NUM_GRABS images */
nGrabs = 0;                         /* Counts the number of images grabbed */
while ( nGrabs < NUM_GRABS )
{
    int bufferIndex;              /* Index of the buffer */
    unsigned char min, max;
    /* Wait for the next buffer to be filled. Wait up to 1000 ms. */
    res = PylonWaitObjectWait( hWait, 1000, &isReady );
    CHECK(res);
    if ( ! isReady )
    {
        /* Timeout occurred. */
        fprintf(stderr, "Grab timeout occurred\n");
        break; // Stop grabbing.
    }

    /* Since the wait operation was successful, the result of at least one grab 
       operation is available. Retrieve it. */
    res = PylonStreamGrabberRetrieveResult( hGrabber, &grabResult, &isReady );
    CHECK(res);
    if ( ! isReady )
    {
        /* Oops. No grab result available? We should never have reached this point. 
           Since the wait operation above returned without a timeout, a grab result 
           should be available. */
        fprintf(stderr, "Failed to retrieve a grab result\n");
        break;
    }

    nGrabs++;

    /* Get the buffer index from the context information. */
    bufferIndex = (int) grabResult.Context;

    /* Check to see if the image was grabbed successfully. */
    if ( grabResult.Status == Grabbed )
    {
        /*  Success. Perform image processing. Since we passed more than one buffer
        to the stream grabber, the remaining buffers are filled while
        we do the image processing. The processed buffer won't be touched by
        the stream grabber until we pass it back to the stream grabber. */

        unsigned char* buffer;        /* Pointer to the buffer attached to the grab result. */

        /* Get the buffer pointer from the result structure. Since we also got the buffer index, 
           we could alternatively use buffers[bufferIndex]. */
        buffer = (unsigned char*) grabResult.pBuffer;

        /* Perform processing. */
        getMinMax( buffer, grabResult.SizeX, grabResult.SizeY, &min, &max );
        printf("Grabbed frame %2d into buffer %2d. Min. gray value = %3u, Max. gray value = %3u\n", 
            nGrabs, bufferIndex, min, max);

        /* Display image */
        res = PylonImageWindowDisplayImageGrabResult(0, &grabResult);
        CHECK(res);

    }
    else if ( grabResult.Status == Failed )
    {
        fprintf( stderr,  "Frame %d wasn't grabbed successfully.  Error code = 0x%08X\n",
            nGrabs, grabResult.ErrorCode );
    }

    /* Once finished with the processing, requeue the buffer to be filled again. */
    res = PylonStreamGrabberQueueBuffer( hGrabber, grabResult.hBuffer, (void*) bufferIndex );
    CHECK(res);
}

----------------------------------------------------------------------------------------------------------------------------------------------------

Finish Grabbing

1. Send the AcquisitionStop command via PylonDeviceExecuteCommandFeature() to tell the camera to stop sending image data.

2. Call PylonStreamGrabberFinishGrab() to release all resources allocated for grabbing.

3. Finally, call PylonStreamGrabberClose() to close the stream grabber.

If the camera is set for continuous acquisition mode, acquisition should first be stopped:

/*  ... Stop the camera. */
res = PylonDeviceExecuteCommandFeature( hDev, "AcquisitionStop");
CHECK(res);

After stopping the camera you must ensure that all buffers waiting in the input queue will be moved to the output queue. You do this by calling the PylonStreamGrabberCancelGrab() function. This will move all pending buffers from the input queue to the output queue and mark them as canceled.

An application should retrieve all buffers from the grabber's output queue before closing a stream grabber. Prior to deallocating their memory, deregister the buffers. After all buffers have been deregistered, call the PylonStreamGrabberFinishGrab() function to release all resources allocated for grabbing. PylonStreamGrabberFinishGrab() must not be called when there are still buffers in the grabber's input queue.

The last step is to close the stream grabber by calling PylonStreamGrabberClose().

Example:

/* ... We must issue a cancel call to ensure that all pending buffers are put into the
   stream grabber's output queue. */
res = PylonStreamGrabberCancelGrab( hGrabber );
CHECK(res);

/* ... The buffers can now be retrieved from the stream grabber. */
do 
{
    res = PylonStreamGrabberRetrieveResult( hGrabber, &grabResult, &isReady );
    CHECK(res);
} while ( isReady );

/* ... When all buffers have been retrieved from the stream grabber, they can be deregistered.
       After that, it is safe to free the memory. */

for ( i = 0; i < NUM_BUFFERS; ++i )   
{
    res = PylonStreamGrabberDeregisterBuffer( hGrabber, bufHandles[i] );
    CHECK(res);
    free( buffers[i] );
}

/* ... Release grabbing related resources. */
res = PylonStreamGrabberFinishGrab( hGrabber );
CHECK(res);

/* After calling PylonStreamGrabberFinishGrab(), parameters that impact the payload size (e.g., 
the AOI width and height parameters) are unlocked and can be modified again. */

/* ... Close the stream grabber. */
res = PylonStreamGrabberClose( hGrabber );
CHECK(res);

----------------------------------------------------------------------------------------------------------------------------------------------------

Sample Program

A complete sample program for acquiring images with a GigE camera in continuous mode can be found in the OverlappedGrab sample, installed as part of the pylon C SDK in <SDK ROOT>\Samples\C\OverlappedGrab.

Using Wait Objects

Using the PylonWaitObjectWait() and PylonWaitObjectWaitEx() functions, an application can wait for a single wait object to become signaled. This has already been demonstrated as part of the grab loop example presented in Retrieving Grabbed Images. However, it is much more common for an application to wait for events from multiple sources. For this purpose, pylon C defines a wait object container, represented by a PYLON_WAITOBJECTS_HANDLE handle. Wait objects can be added to a container by calling PylonWaitObjectsAdd() or PylonWaitObjectsAddMany(). Once the wait objects are added to a container, an application can wait for the wait objects to become signaled:

Sample Program

The PylonWaitObjectsWaitForAll() function can be used to wait simultaneously for buffers from several cameras; see the ManyCameras sample program installed as part of the pylon C SDK (it does not seem to be included in SDK version 3.2, though). That program grabs images from two cameras at the same time and, in its grab loop, waits until both wait objects signal that their buffers have been filled.

The snippets below illustrate the PylonWaitObjectsWaitForAny() function, which is used to simultaneously wait for buffers and for a termination request. They are taken from a sample program installed as part of the pylon C SDK.

The program grabs images for 5 seconds and then exits. First, the program creates a wait object container to hold all its wait objects. It then creates a WaitableTimer, which is transformed into a pylon C wait object. The wait object is then added to the container. Note that PylonWaitObjectFromW32() is invoked with the duplicate argument set to 0, which means that ownership of the timer handle is transferred to the wait object, which is now responsible for deleting the handle during cleanup.

/* Create wait objects (must be done outside of the loop). */
res = PylonWaitObjectsCreate(&wos);
CHECK(res);

/* In this sample, we want to grab for a given amount of time, then stop.
Create a Windows timer, wrap it in a pylon C wait object, and add it to
the wait object set. */
hTimer = CreateWaitableTimer(NULL, TRUE, NULL);
if (hTimer == NULL)
    printErrorAndExit(GENAPI_E_FAIL);
res = PylonWaitObjectFromW32(hTimer, 0, &woTimer);
CHECK(res);
res = PylonWaitObjectsAdd(wos, woTimer, NULL);
CHECK(res);

In this code snippet, multiple cameras are used for simultaneous grabbing. Every one of these cameras has a stream grabber, which in turn has a wait object. All these wait objects are added to the container, too. This is achieved by executing the following statements in a loop, once for every camera:

/* Get a handle for the stream grabber's wait object. The wait object
   allows waiting for buffers to be filled with grabbed data. */
res = PylonStreamGrabberGetWaitObject( hGrabber[deviceIndex], &hWait );
CHECK(res);

/* Add the stream grabber's wait object to our wait objects.
   This is needed to be able to wait until at least one camera has 
   grabbed an image in the grab loop below. */
res = PylonWaitObjectsAdd(wos, hWait, NULL);
CHECK(res);

At the beginning of the grab loop, PylonWaitObjectsWaitForAny() is called. The index value returned is used to determine whether a buffer has been grabbed or the timer has expired; in the latter case, the program should stop grabbing and exit:

/* Grab until the timer expires. */
for (;;)
{
    _Bool isReady;
    size_t woidx;
    unsigned char min, max;
    PylonGrabResult_t grabResult;

    /* Wait for the next buffer to be filled. Wait up to 1000 ms. */
    res = PylonWaitObjectsWaitForAny(wos, 1000, &woidx, &isReady );
    CHECK(res);
    if ( !isReady )
    {
        /* Timeout occurred. */
        fputs("Grab timeout occurred.\n", stderr);
        break; // Stop grabbing.
    }

    /* If the timer has expired, exit the grab loop */
    if (woidx == 0) {
        fputs("Game over.\n", stderr);
        break;  /* timer expired */
    }

    /* Account for the timer. */
    --woidx;

    /* Retrieve the grab result. */
    res = PylonStreamGrabberRetrieveResult( hGrabber[woidx], &grabResult, &isReady );
    CHECK(res);
    if ( !isReady )
    {
        /* Oops. No grab result available? We should never have reached this point. 
           Since the wait operation above returned without a timeout, a grab result 
           should be available. */
        fprintf(stderr, "Failed to retrieve a grab result\n");
        break;
    }

    /* Check to see if the image was grabbed successfully. */
    if ( grabResult.Status == Grabbed )
    {
        /*  Success. Perform image processing. Since we passed more than one buffer
        to the stream grabber, the remaining buffers are filled while
        we do the image processing. The processed buffer won't be touched by
        the stream grabber until we pass it back to the stream grabber. */

        /* Pointer to the buffer attached to the grab result
           Get the buffer pointer from the result structure. Since we also got the buffer index, 
           we could alternatively use buffers[bufferIndex]. */
        unsigned char* buffer = (unsigned char*) grabResult.pBuffer;

        /* Perform processing. */
        getMinMax( buffer, grabResult.SizeX, grabResult.SizeY, &min, &max );
        printf("Grabbed frame #%2u from camera %2u into buffer %2d. Min. val=%3u, Max. val=%3u\n", 
               nGrabs, woidx, (int) grabResult.Context, min, max);

        /* Display image */
        res = PylonImageWindowDisplayImageGrabResult(woidx, &grabResult);
        CHECK(res);
    }
    else if ( grabResult.Status == Failed )
    {
        fprintf( stderr,  "Frame %u wasn't grabbed successfully.  Error code = 0x%08X\n",
            nGrabs, grabResult.ErrorCode );
    }

    /* Once finished with the processing, requeue the buffer to be filled again. */
    res = PylonStreamGrabberQueueBuffer( hGrabber[woidx], grabResult.hBuffer, grabResult.Context );
    CHECK(res);

    nGrabs++;
}

Finally, during cleanup the timer wait object is destroyed. This frees the timer handle included within it.

/* Remove all wait objects from waitobjects. */
res = PylonWaitObjectsRemoveAll(wos);
CHECK(res);
res = PylonWaitObjectDestroy(woTimer);
CHECK(res);
res = PylonWaitObjectsDestroy(wos);
CHECK(res);

----------------------------------------------------------------------------------------------------------------------------------------------------

Interruptible Wait Operation

PylonWaitObjectsWaitForAnyEx() and the related Ex functions can react to asynchronous operating system events, such as Asynchronous Procedure Calls (APCs) on Windows or signals on UNIX.

The PylonWaitObjectsWaitForAnyEx() and PylonWaitObjectsWaitForAllEx() functions, as well as PylonWaitObjectWaitEx(), take an additional boolean argument, Alertable, which allows the caller to specify whether the wait operation should be interruptible. An interruptible wait is terminated prematurely whenever a certain asynchronous system event (a user APC on Windows, or a signal on Unix) occurs. This rarely needed feature has special uses that are beyond the scope of this document.

----------------------------------------------------------------------------------------------------------------------------------------------------

Handling Camera Events

The camera can send event messages to notify the host PC, for example of an end-of-exposure event.

Basler GigE Vision, FireWire IEEE 1394, and Camera Link cameras used with Basler pylon software can send event messages. For example, when a sensor exposure has finished, the camera can send an end-of-exposure event to the computer. The event can be received by the computer before the image data for the finished exposure has been completely transferred. Retrieval and processing of event messages is described in this section.

Event Grabbers

1. Event grabbers are used on the host PC to receive events sent by the camera, much like stream grabbers are used to receive image data.

2. PylonDeviceGetEventGrabber(hDev, &hEventGrabber) obtains the event grabber;

   if ( hEventGrabber == PYLONC_INVALID_HANDLE ) checks whether the handle is invalid (i.e. event grabbers are not supported).

3. PylonEventGrabberOpen(hEventGrabber) prepares the grabber for receiving events sent by the camera.

4. PylonEventGrabberGetWaitObject( hEventGrabber, &hWaitEvent ) retrieves the associated wait object.

Receiving event data sent by a camera is accomplished in much the same way as receiving image data. While the latter involves use of a stream grabber, an event grabber is used for obtaining events.

Getting and Preparing Event Grabbers

Event grabbers can be obtained by PylonDeviceGetEventGrabber().

/* Create and prepare an event grabber. */
/* ... Get a handle for the event grabber. */
res = PylonDeviceGetEventGrabber( hDev, &hEventGrabber );
CHECK(res);
if ( hEventGrabber == PYLONC_INVALID_HANDLE )
{
    /* The transport layer doesn't support event grabbers. */
    fprintf(stderr, "No event grabber supported.\n");
    PylonTerminate();
    pressEnterToExit();
    return EXIT_FAILURE;
}

The camera object owns event grabbers created this way and manages their lifetime.

Unlike stream grabbers, event grabbers use internal memory buffers for receiving event messages. The number of buffers can be parameterized through the PylonEventGrabberSetNumBuffers() function:

/* ... Tell the grabber how many buffers to use. */
res = PylonEventGrabberSetNumBuffers( hEventGrabber, NUM_EVENT_BUFFERS );
CHECK(res);
Note:
The number of buffers must be set before calling PylonEventGrabberOpen().

A connection to the device and all resources required for receiving events are allocated by calling PylonEventGrabberOpen(). After that, a wait object handle can be obtained for the application to be notified of any occurring events.

/* ... Open the event grabber. */
res = PylonEventGrabberOpen( hEventGrabber );  /* The event grabber is now ready
                                               for receiving events. */  
CHECK(res);

/* Retrieve the wait object that is associated with the event grabber. The event 
will be signalled when an event message has been received. */
res = PylonEventGrabberGetWaitObject( hEventGrabber, &hWaitEvent );
CHECK(res);

----------------------------------------------------------------------------------------------------------------------------------------------------

Enabling Events

 

1. Select the event type to report, e.g. to be notified when exposure ends:

    PylonDeviceFeatureFromString( hDev, "EventSelector", "ExposureEnd" );

2. Enable event reporting:

    PylonDeviceFeatureFromString( hDev, "EventNotification", "GenICamEvent" )

3. Disable event reporting:

    PylonDeviceFeatureFromString( hDev, "EventNotification", "Off" );

Sending of event messages must be explicitly enabled on the camera by setting its EventSelector parameter to the type of the desired event. In the following example the selector is set to the end-of-exposure event. After this, sending events of the desired type is enabled through the EventNotification parameter:

/* Enable camera event reporting. */
/* ... Select the end-of-exposure event reporting. */
res = PylonDeviceFeatureFromString( hDev, "EventSelector", "ExposureEnd" );
CHECK(res);
/* ... Enable the event reporting. */
res = PylonDeviceFeatureFromString( hDev, "EventNotification", "GenICamEvent" );
CHECK(res);

To be sure that no events are missed, the event grabber should be prepared before event messages are enabled (see the Getting and Preparing Event Grabbers section above).
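Put together, the recommended order looks roughly like the following condensed sketch, which only reuses the calls shown in this chapter:

/* Condensed sketch: prepare the event grabber first, then enable event reporting
   on the camera (all calls as shown in the surrounding snippets). */
res = PylonDeviceGetEventGrabber( hDev, &hEventGrabber );
CHECK(res);
res = PylonEventGrabberSetNumBuffers( hEventGrabber, NUM_EVENT_BUFFERS );
CHECK(res);
res = PylonEventGrabberOpen( hEventGrabber );   /* Ready to receive event messages. */
CHECK(res);

/* Only now enable the sending of end-of-exposure events on the camera. */
res = PylonDeviceFeatureFromString( hDev, "EventSelector", "ExposureEnd" );
CHECK(res);
res = PylonDeviceFeatureFromString( hDev, "EventNotification", "GenICamEvent" );
CHECK(res);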

The following code snippet illustrates how to disable the sending of end-of-exposure events:

/* ... Switch-off the events. */
res = PylonDeviceFeatureFromString( hDev, "EventSelector", "ExposureEnd" );
CHECK(res);
res = PylonDeviceFeatureFromString( hDev, "EventNotification", "Off" );
CHECK(res);

----------------------------------------------------------------------------------------------------------------------------------------------------

Receiving Event Messages

Receiving event messages: whenever an event message arrives, the event grabber's wait object is signaled.

1. Create a wait object container:  PylonWaitObjectsCreate( &hWaitObjects )

2. Add the wait object handles:  PylonWaitObjectsAddMany( hWaitObjects, 2, hWaitEvent, hWaitStream)

3. Start image acquisition:  PylonDeviceExecuteCommandFeature( hDev, "AcquisitionStart")

Receiving event messages is very similar to grabbing images. The event grabber provides a wait object that is signaled whenever an event message becomes available. When an event message is available, it can be retrieved by calling PylonEventGrabberRetrieveEvent().

In typical applications, waiting for grabbed images and event messages is done in one common loop. This is demonstrated in the following code snippet:

/* Put the wait objects into a container. */
/* ... Create the container. */
res = PylonWaitObjectsCreate( &hWaitObjects );
CHECK(res);
/* ... Add the wait objects' handles. */
res = PylonWaitObjectsAddMany( hWaitObjects, 2, hWaitEvent, hWaitStream);
CHECK(res);

/* Let the camera acquire images. */
res = PylonDeviceExecuteCommandFeature( hDev, "AcquisitionStart");
CHECK(res);

/* Grab NUM_GRABS images. */
nGrabs = 0;                         /* Counts the number of images grabbed. */
while ( nGrabs < NUM_GRABS )
{
    int bufferIndex;              /* Index of the buffer. */
    size_t waitObjectIndex;          /* Index of the wait object that is signalled.*/
    unsigned char min, max;

    /* Wait for either an image buffer grabbed or an event received. Wait up to 1000 ms. */
    res = PylonWaitObjectsWaitForAny( hWaitObjects, 1000, &waitObjectIndex, &isReady );
    CHECK(res);
    if ( ! isReady )
    {
        /* Timeout occurred. */
        fprintf(stderr, "Timeout. Neither grabbed an image nor received an event.\n");
        break; // Stop grabbing.
    }

    if ( 0 == waitObjectIndex )
    {
        PylonEventResult_t eventMsg;
        /* hWaitEvent has been signalled. At least one event message is available. Retrieve it. */
        res = PylonEventGrabberRetrieveEvent( hEventGrabber, &eventMsg, &isReady );
        CHECK(res);
        if ( ! isReady )
        {
            /* Oops. No event message available? We should never have reached this point. 
            Since the wait operation above returned without a timeout, an event message 
            should be available. */
            fprintf(stderr, "Failed to retrieve an event\n");
            break;
        }
        /* Check to see if the event was successfully received. */
        if ( 0 == eventMsg.ErrorCode )
        {
            /* Successfully received an event message. */
            /* Pass the event message to the event adapter. The event adapter will 
            update the parameters related to events and will fire the callbacks
            registered to event related parameters. */
            res = PylonEventAdapterDeliverMessage( hEventAdapter, &eventMsg );
            CHECK(res);
        }
        else
        {
            fprintf(stderr, "Error when receiving an event: 0x%08x\n", eventMsg.ErrorCode );
        }
    }
    else if ( 1 == waitObjectIndex )
    {
        /* hWaitStream has been signalled. The result of at least one grab 
        operation is available. Retrieve it. */
        res = PylonStreamGrabberRetrieveResult( hStreamGrabber, &grabResult, &isReady );
        CHECK(res);
        if ( ! isReady )
        {
            /* Oops. No grab result available? We should never have reached this point. 
            Since the wait operation above returned without a timeout, a grab result 
            should be available. */
            fprintf(stderr, "Failed to retrieve a grab result\n");
            break;
        }

        nGrabs++;

        /* Get the buffer index from the context information. */
        bufferIndex = (int) grabResult.Context;

        /* Check to see if the image was grabbed successfully. */
        if ( grabResult.Status == Grabbed )
        {
            /*  Success. Perform image processing. Since we passed more than one buffer
            to the stream grabber, the remaining buffers are filled while
            we do the image processing. The processed buffer won't be touched by
            the stream grabber until we pass it back to the stream grabber. */

            unsigned char* buffer;        /* Pointer to the buffer attached to the grab result. */

            /* Get the buffer pointer from the result structure. Since we also got the buffer index, 
            we could alternatively use buffers[bufferIndex]. */
            buffer = (unsigned char*) grabResult.pBuffer;


            getMinMax( buffer, grabResult.SizeX, grabResult.SizeY, &min, &max );
            printf("Grabbed frame #%2d into buffer %2d. Min. gray value = %3u, Max. gray value = %3u\n", 
                nGrabs, bufferIndex, min, max);
        }
        else if ( grabResult.Status == Failed )
        {
            fprintf( stderr,  "Frame %d wasn't grabbed successfully.  Error code = 0x%08X\n",
                nGrabs, grabResult.ErrorCode );
        }

        /* Once finished with the processing, requeue the buffer to be filled again. */
        res = PylonStreamGrabberQueueBuffer( hStreamGrabber, grabResult.hBuffer, (void*) bufferIndex );
        CHECK(res);
    }
}

----------------------------------------------------------------------------------------------------------------------------------------------------

Parsing and Dispatching Event Messages


While the previous section explained how to receive event messages, this section describes how to interpret them.

The specific layout of event messages depends on the event type and the camera type. The pylon C API uses support from GenICam for parsing event messages. This means that the message layout is described in the camera's XML description file.

As described in the Generic Parameter Access section, a GenApi node map is created from the camera's XML description file. This node map contains node objects representing the elements of the XML file. Since the layout of event messages is also described in the camera description file, the information carried by the event messages is exposed as nodes in the node map. These can be accessed just like any other node.

For example, an end-of-exposure event carries the following information:

  • ExposureEndEventFrameID: holds an identification number for the image frame that the event is related to
  • ExposureEndEventTimestamp: creation time of the event
  • ExposureEndEventStreamChannelIndex: the number of the image data stream used to transfer the image that the event is related to
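Once an event message has been delivered to the camera's node map (via an event adapter, as described below), these nodes can be read like any other parameter. A minimal sketch, assuming a node map handle hNodeMap obtained earlier with PylonDeviceGetNodeMap():

/* Sketch: read the end-of-exposure frame ID after an event message has been delivered
   (assumptions: hNodeMap was obtained with PylonDeviceGetNodeMap() and the camera
   provides the ExposureEndEventFrameID node). */
NODE_HANDLE hFrameIdNode;
int64_t     frameId;

res = GenApiNodeMapGetNode( hNodeMap, "ExposureEndEventFrameID", &hFrameIdNode );
CHECK(res);
if ( GENAPIC_INVALID_HANDLE != hFrameIdNode )
{
    res = GenApiIntegerGetValue( hFrameIdNode, &frameId );
    CHECK(res);
    printf( "Last end-of-exposure event: frame ID %d\n", (int) frameId );
}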

An event adapter is used to update the event-related nodes of the camera's node map. Updating the nodes is done by passing the event message to an event adapter.

Event adapters are created by camera objects:

/* For extracting the event data from an event message, an event adapter is used. */
res = PylonDeviceCreateEventAdapter( hDev, &hEventAdapter );
CHECK(res);
if ( hEventAdapter == PYLONC_INVALID_HANDLE )
{
    /* The transport layer doesn't support event grabbers. */
    fprintf(stderr, "No event adapter supported.\n");
    PylonTerminate();
    pressEnterToExit();
    return EXIT_FAILURE;
}

To update any event-related nodes, call PylonEventAdapterDeliverMessage() for every event message received:

PylonEventResult_t eventMsg;
/* hWaitEvent has been signalled. At least one event message is available. Retrieve it. */
res = PylonEventGrabberRetrieveEvent( hEventGrabber, &eventMsg, &isReady );
CHECK(res);
if ( ! isReady )
{
    /* Oops. No event message available? We should never have reached this point. 
    Since the wait operation above returned without a timeout, an event message 
    should be available. */
    fprintf(stderr, "Failed to retrieve an event\n");
    break;
}
/* Check to see if the event was successfully received. */
if ( 0 == eventMsg.ErrorCode )
{
    /* Successfully received an event message. */
    /* Pass the event message to the event adapter. The event adapter will 
    update the parameters related to events and will fire the callbacks
    registered to event related parameters. */
    res = PylonEventAdapterDeliverMessage( hEventAdapter, &eventMsg );
    CHECK(res);
}
else
{
    fprintf(stderr, "Error when receiving an event: 0x%08x\n", eventMsg.ErrorCode );
}

Event Callbacks


The previous section described how event adapters are used to push the contents of event messages into a camera object's node map. The PylonEventAdapterDeliverMessage() function updates all nodes related to events contained in the message passed in.

As described in the Getting Notified About Parameter Changes section, it is possible to register callback functions that are called when nodes may have been changed. These callbacks can be used to determine if an event message contains a particular kind of event. For example, to get informed about end-of-exposure events, a callback for one of the end-of-exposure event-related nodes must be installed. The following code snippet illustrates how to install a callback function for the ExposureEndEventFrameID node:

/* Register the callback function for ExposureEndEventFrameID parameter. */
/*... Get the node map containing all parameters. */
res = PylonDeviceGetNodeMap( hDev, &hNodeMap );
CHECK(res);
/* ... Get the ExposureEndEventFrameID parameter. */ 
res = GenApiNodeMapGetNode( hNodeMap, "ExposureEndEventFrameID", &hNode );
CHECK(res);

if ( GENAPIC_INVALID_HANDLE == hNode )
{
    /* There is no ExposureEndEventFrameID parameter. */
    fprintf( stderr, "There is no ExposureEndEventFrameID parameter.\n");
    PylonTerminate();
    pressEnterToExit();
    return EXIT_FAILURE;
}

/* ... Register the callback function. */
res = GenApiNodeRegisterCallback( hNode, endOfExposureCallback, &hCallback );
CHECK(res);

The registered callback will be called by pylon C from the context of the PylonEventAdapterDeliverMessage() function.

Note:
Since one event message can aggregate multiple events, PylonEventAdapterDeliverMessage() can issue multiple calls to a callback function when multiple events of the same type are contained in the message.
/* Callback will be fired when an event message contains an end-of-exposure event. */
void _stdcall endOfExposureCallback( NODE_HANDLE hNode )
{
    int64_t frame;
    GENAPIC_RESULT res;
    res = GenApiIntegerGetValue( hNode, &frame );
    CHECK(res);

    printf("Got end-of-exposure event. Frame number: %I64d\n", frame );
}

Cleanup


Before closing and destroying the camera object, the event-related objects must be closed as illustrated in the following code snippet:

/* ... Deregister the callback. */
res = GenApiNodeDeregisterCallback( hNode, hCallback );
CHECK(res);

/* ... Close the event grabber.*/
res = PylonEventGrabberClose( hEventGrabber );
CHECK(res);

/* ... Release the event adapter. */
res = PylonDeviceDestroyEventAdapter( hDev, hEventAdapter );
CHECK(res);

Sample Program


The code snippets in this chapter are taken from the 'Events' sample program (see Events Sample) installed as part of the pylon C SDK in <SDK ROOT>\Samples\C\Events.

Chunk Parser: Accessing Chunk Features

Basler cameras are capable of sending additional information appended to the image data as chunks of data, such as frame counters, time stamps, and CRC checksums. The information included in the chunk data is presented to an application in the form of parameters that receive their values from the chunk parsing mechanism. This section explains how to enable the chunk features and how to access the chunk data.

Enabling Chunks

Before a feature producing chunk data can be enabled, the camera's chunk mode must be enabled:

/* Before enabling individual chunks, the chunk mode in general must be activated. */
isAvail = PylonDeviceFeatureIsWritable(hDev, "ChunkModeActive");
if ( ! isAvail )
{
    fprintf( stderr, "The device doesn't support the chunk mode.\n");
    goto exit;
}

/* Activate the chunk mode. */
res = PylonDeviceSetBooleanFeature( hDev, "ChunkModeActive", 1);
CHECK(res);

After having been set to chunk mode, the camera transfers data blocks that are partitioned into a sequence of chunks. The first chunk is always the image data. When chunk features are enabled, the image data chunk is followed by chunks containing the information generated by the chunk features.

Once chunk mode is enabled, chunk features can be enabled:

/* Enable some individual chunks... */

/* ... The frame counter chunk feature. */
/* Is the chunk available? */
isAvail = PylonDeviceFeatureIsAvailable(hDev, "EnumEntry_ChunkSelector_Framecounter");
if ( isAvail )
{
    /* Select the frame counter chunk feature. */
    res = PylonDeviceFeatureFromString( hDev, "ChunkSelector", "Framecounter" );
    CHECK(res);
    /* Can the chunk feature be activated? */
    isAvail = PylonDeviceFeatureIsWritable(hDev, "ChunkEnable");
    if ( isAvail )
    {
        /* Activate the chunk feature. */
        res = PylonDeviceSetBooleanFeature( hDev, "ChunkEnable", 1);
        CHECK(res);
    }
}
/* ... The CRC checksum chunk feature. */
/*  Note: Enabling the CRC chunk feature is not a prerequisite for using
    chunks. Chunks can also be handled when the CRC feature is disabled. */
isAvail = PylonDeviceFeatureIsAvailable(hDev, "EnumEntry_ChunkSelector_PayloadCRC16");
if ( isAvail )
{
    /* Select the CRC chunk feature. */
    res = PylonDeviceFeatureFromString( hDev, "ChunkSelector", "PayloadCRC16" );
    CHECK(res);
    /* Can the chunk feature be activated? */
    isAvail = PylonDeviceFeatureIsWritable(hDev, "ChunkEnable");
    if ( isAvail )
    {
        /* Activate the chunk feature. */
        res = PylonDeviceSetBooleanFeature( hDev, "ChunkEnable", 1);
        CHECK(res);
    }
}

Grabbing Buffers

Grabbing from an image stream with chunks is very similar to grabbing from an image stream without chunks. Memory buffers must be provided that are large enough to store both the image data and the added chunk data.

The camera's PayloadSize parameter reports the necessary buffer size (in bytes):

/* Determine the required size of the grab buffer. Since activating chunks will increase the
   payload size and thus the required buffer size, do this after enabling the chunks. */
res = PylonDeviceGetIntegerFeatureInt32( hDev, "PayloadSize", &payloadSize );
CHECK(res);

/* Allocate memory for grabbing.  */
for ( i = 0; i < NUM_BUFFERS; ++i )
{
    buffers[i] = (unsigned char*) malloc ( payloadSize );
    if ( NULL == buffers[i] )
    {
        fprintf( stderr, "Out of memory.\n" );
        PylonTerminate();
        pressEnterToExit();
        exit(EXIT_FAILURE);
    }
}

/* We must tell the stream grabber the number and size of the buffers 
   we are using. */
/* .. We will not use more than NUM_BUFFERS for grabbing. */
res = PylonStreamGrabberSetMaxNumBuffer( hGrabber, NUM_BUFFERS );
CHECK(res);
/* .. We will not use buffers bigger than payloadSize bytes. */
res = PylonStreamGrabberSetMaxBufferSize( hGrabber, payloadSize );
CHECK(res);

Once the camera has been set to produce chunk data, and data buffers have been set up taking into account the additional buffer space required to hold the chunk data, grabbing works exactly the same as in the 'no chunks' case.

Accessing the Chunk Data

The data block containing the image chunk and the other chunks has a self-descriptive layout. Before accessing the data contained in the appended chunks, the data block must be parsed by a chunk parser.

The camera object is responsible for creating a chunk parser:

/* The data block containing the image chunk and the other chunks has a self-descriptive layout. 
   A chunk parser is used to extract the appended chunk data from the grabbed image frame.
   Create a chunk parser. */
res = PylonDeviceCreateChunkParser( hDev, &hChunkParser );
CHECK(res);
if ( hChunkParser == PYLONC_INVALID_HANDLE )
{
    /* The transport layer doesn't provide a chunk parser. */
    fprintf(stderr, "No chunk parser available.\n");
    goto exit;
}

Once a chunk parser is created, grabbed buffers can be attached to it. When a buffer is attached to a chunk parser, it is parsed and access to its data is provided through camera parameters.

/* Check to see if we really got image data plus chunk data. */
if ( grabResult.PayloadType != PayloadType_ChunkData )
{
    fprintf(stderr, "Received a buffer not containing chunk data?\n");
}
else
{
    /* Process the chunk data. This is done by passing the grabbed image buffer
       to the chunk parser. When the chunk parser has processed the buffer, the chunk 
       data can be accessed in the same manner as "normal" camera parameters. 
       The only exception is the CRC feature. There are dedicated functions for
       checking the CRC checksum. */

    _Bool hasCRC;

    /* Let the parser extract the data. */
    res = PylonChunkParserAttachBuffer( hChunkParser, grabResult.pBuffer, (size_t) grabResult.PayloadSize  );
    CHECK(res);

    /* Check the CRC. */
    res = PylonChunkParserHasCRC( hChunkParser, &hasCRC );
    CHECK(res);
    if ( hasCRC )
    {
        _Bool isOk; 
        res = PylonChunkParserCheckCRC( hChunkParser, &isOk );
        CHECK(res);
        printf("Frame %d contains a CRC checksum. The checksum %s ok.\n", nGrabs, isOk ? "is" : "is not");
    }


    /* Retrieve the frame counter value. */
    /* ... Check the availability. */
    isAvail = PylonDeviceFeatureIsAvailable(hDev, "ChunkFramecounter");
    printf("Frame %d %s a frame counter chunk.\n", nGrabs, isAvail ? "contains" : "doesn't contain" );
    if ( isAvail )
    {
        /* ... Get the value. */
        int64_t counter;
        res = PylonDeviceGetIntegerFeature( hDev, "ChunkFramecounter", &counter );
        CHECK(res);
        printf("Frame counter of frame %d: %I64d.\n", nGrabs, counter );
    }

    /* Retrieve the frame width value. */
    /* ... Check the availability. */
    isAvail = PylonDeviceFeatureIsAvailable(hDev, "ChunkWidth");
    printf("Frame %d %s a width chunk.\n", nGrabs, isAvail ? "contains" : "doesn't contain" );              
    if ( isAvail )
    {
        /* ... Get the value. */
        res = PylonDeviceGetIntegerFeatureInt32( hDev, "ChunkWidth", &chunkWidth );
        CHECK(res);                                     
        printf("Width of frame %d: %d.\n", nGrabs, chunkWidth );
    }

    /* Retrieve the frame height value. */
    /* ... Check the availability. */           
    isAvail = PylonDeviceFeatureIsAvailable(hDev, "ChunkHeight");
    printf("Frame %d %s a height chunk.\n", nGrabs, isAvail ? "contains" : "doesn't contain" );             
    if ( isAvail )
    {
        /* ... Get the value. */
        res = PylonDeviceGetIntegerFeatureInt32( hDev, "ChunkHeight", &chunkHeight );
        CHECK(res);
        printf("Height of frame %d: %d.\n", nGrabs, chunkHeight );
    }                
}

Chunk data integrity may be protected by an optional checksum. To check for its presence, use PylonChunkParserHasCRC().

/* Check the CRC. */
res = PylonChunkParserHasCRC( hChunkParser, &hasCRC );
CHECK(res);
if ( hasCRC )
{
    _Bool isOk; 
    res = PylonChunkParserCheckCRC( hChunkParser, &isOk );
    CHECK(res);
    printf("Frame %d contains a CRC checksum. The checksum %s ok.\n", nGrabs, isOk ? "is" : "is not");
}

Before re-using a buffer for grabbing, the buffer must be detached from the chunk parser.

/* Before requeueing the buffer, you should detach it from the chunk parser. */
res = PylonChunkParserDetachBuffer( hChunkParser );  /* The chunk data in the buffer is now no longer accessible. */
CHECK(res);

After detaching a buffer, the next grabbed buffer can be attached and the included chunk data can be read.

After grabbing is finished, the chunk parser must be deleted:

/* ... Release the chunk parser. */
res = PylonDeviceDestroyChunkParser( hDev, hChunkParser );
CHECK(res);

Sample Program

The code snippets in this chapter are taken from the 'Chunks' sample program (see Chunks Sample) installed as part of the pylon C SDK in <SDK ROOT>\Samples\C\Chunks.

Getting Informed About Device Removal

Getting notified when a device is removed:

1. Register a removal callback:  PylonDeviceRegisterRemovalCallback( hDev, removalCallbackFunction, &hCb )

2. Inside the callback, the device information can still be obtained with PylonDeviceGetDeviceInfo( hDevice, &di )

Callback functions can be installed that are called whenever a camera device is removed. As soon as the PylonDeviceOpen() function has been called, callback functions of the PylonDeviceRemCb_t type can be installed for it.

Installing a callback function:

/* Register the callback function. */
res = PylonDeviceRegisterRemovalCallback( hDev, removalCallbackFunction, &hCb );
CHECK(res);

All registered callbacks must be deregistered before calling PylonDeviceClose().

/* ... Deregister the removal callback. */  
res = PylonDeviceDeregisterRemovalCallback( hDev, hCb );
CHECK(res);

This is the actual callback function. It prints the name of the removed device and increments a counter.

/* The function to be called when the removal of an opened device is detected. */
void _stdcall removalCallbackFunction(PYLON_DEVICE_HANDLE hDevice )
{
    PylonDeviceInfo_t   di;
    GENAPIC_RESULT      res;
    
    /* Print out the name of the device. It is not possible to read the name 
    from the camera since it has been removed. Use the device's device 
    information instead. For accessing the device information, no reading from 
    the device is required. */
    
    /* Retrieve the device information for the removed device. */
    res = PylonDeviceGetDeviceInfo( hDevice, &di );
    CHECK(res);
    
    
    /* Print out the name. */
    printf( "\nCallback function for removal of device %s (%s).", di.FriendlyName, di.FullName );
    
    /* Increment the counter to indicate that the callback has been fired. */
    callbackCounter++;
}

The code snippets in this section are taken from the 'SurpriseRemoval' sample program (see SurpriseRemoval Sample) installed as part of the pylon C SDK in <SDK ROOT>\Samples\C\SurpriseRemoval.

Advanced Topics

Generic Parameter Access


For camera configuration and for accessing other parameters, the pylon API uses the technologies defined by the GenICam standard hosted by the European Machine Vision Association (EMVA). The GenICam specification (http://www.GenICam.org) defines a format for camera description files. These files describe the configuration interface of GenICam compliant cameras. The description files are written in XML (eXtensible Markup Language) and describe camera registers, their interdependencies, and all other information needed to access high-level features such as Gain, Exposure Time, or Image Format by means of low-level register read and write operations.

The elements of a camera description file are represented as software objects called nodes. For example, a node can represent a single camera register, a camera parameter such as Gain, a set of available parameter values, etc. Nodes are represented as handles of the NODE_HANDLE type.

Nodes are linked together by different relationships as explained in the GenICam standard document available at www.GenICam.org. The complete set of nodes is stored in a data structure called a node map. At runtime, a node map is instantiated from an XML description, which may exist as a disk file on the computer connected to a camera, or may be read from the camera itself. Node map objects are represented by handles of the NODEMAP_HANDLE type.

Every node has a name, which is a text string. Node names are unique within a node map, and any node can be looked up by its name. All parameter access functions presented so far are actually shortcuts that get a node map handle from an object, look up a node that implements a named parameter, and finally perform the desired action on the node, such as assigning a new value. The sample code below demonstrates how to look up a parameter node with a known name. If no such node exists, GenApiNodeMapGetNode() returns an invalid handle. This case needs to be handled by the program, as in the sample code below, but a real program may want to handle it differently.

/* Look up the feature node */
res = GenApiNodeMapGetNode(hNodeMap, featureName, &hNode);
CHECK(res);
if (GENAPIC_INVALID_HANDLE == hNode)
{
    fprintf(stderr, "There is no feature named '%s'\n", featureName);
    exit(EXIT_FAILURE);
}

Nodes are generally grouped into categories, which themselves are represented as nodes of the Category type. A category node is an abstraction for a certain functional aspect of a camera, and all parameter nodes grouped under it are related to this aspect. For example, the 'AOI Controls' category might contain an 'X Offset', a 'Y Offset', a 'Width', and a 'Height' parameter node. The topological structure of a node map is that of a tree, with parameter nodes as leaves and category nodes as junctions. The sample code below traverses the tree, displaying every node found:

/* Traverse the feature tree, displaying all categories and all features. */
static void
handleCategory(NODE_HANDLE hRoot, char * buf, unsigned int depth)
{
    GENAPIC_RESULT      res;
    size_t              bufsiz, siz, numfeat, i;

    /* Write out node name. */
    siz = bufsiz = STRING_BUFFER_SIZE - depth * 2;
    res = GenApiNodeGetName(hRoot, buf, &siz);
    CHECK(res);

    /* Get the number of feature nodes in this category. */
    res = GenApiCategoryGetNumFeatures(hRoot, &numfeat);
    CHECK(res);

    printf("%s category has %u children\n", buf - depth * 2, numfeat);


    /* Increase indentation. */
    *buf++ = ' ';
    *buf++ = ' ';
    bufsiz -= 2;
    ++depth;

    /* Now loop over all feature nodes. */
    for (i = 0; i < numfeat; ++i)
    {
        NODE_HANDLE         hNode;
        EGenApiNodeType     nodeType;

        /* Get next feature node and check its type. */
        res = GenApiCategoryGetFeatureByIndex(hRoot, i, &hNode);
        CHECK(res);
        res = GenApiNodeGetType(hNode, &nodeType);
        CHECK(res);

        if (Category != nodeType)
        {
            /* A regular feature. */
            EGenApiAccessMode am;
            const char *amode;

            siz = bufsiz;
            res = GenApiNodeGetName(hNode, buf, &siz);
            CHECK(res);
            res = GenApiNodeGetAccessMode(hNode, &am);
            CHECK(res);

            switch (am)
            {
            case NI:
                amode = "not implemented";
                break;
            case NA:
                amode = "not available";
                break;
            case WO:
                amode = "write only";
                break;
            case RO:
                amode = "read only";
                break;
            case RW:
                amode = "read and write";
                break;
            default:
                amode = "undefined";
                break;
            }

            printf("%s feature - access: %s\n", buf - depth * 2, amode);
        }
        else
            /* Another category node. */
            handleCategory(hNode, buf, depth);
    }
}

static void
demonstrateCategory(PYLON_DEVICE_HANDLE hDev)
{
    NODEMAP_HANDLE      hNodeMap;
    NODE_HANDLE         hNode;
    char                buf[512];
    GENAPIC_RESULT      res; 

    /* Get a handle for the device's node map. */
    res = PylonDeviceGetNodeMap(hDev, &hNodeMap);
    CHECK(res);

    /* Look up the root node. */
    res = GenApiNodeMapGetNode(hNodeMap, "Root", &hNode);
    CHECK(res);

    handleCategory(hNode, buf, 0);
}

In order to access a parameter's value, a handle for the corresponding parameter node must be obtained first, as demonstrated in the example below for an integer feature:

/* This function demonstrates how to handle integer camera parameters. */
static void
demonstrateIntFeature(PYLON_DEVICE_HANDLE hDev)
{
    NODEMAP_HANDLE      hNodeMap;
    NODE_HANDLE         hNode;
    static const char   featureName[] = "Width";  /* Name of the feature used in this sample: AOI Width. */
    int64_t             val, min, max, incr;      /* Properties of the feature. */
    GENAPIC_RESULT      res;                      /* Return value. */ 
    EGenApiNodeType     nodeType;
    _Bool                bval;

    /* Get a handle for the device's node map. */
    res = PylonDeviceGetNodeMap(hDev, &hNodeMap);
    CHECK(res);

    /* Look up the feature node */
    res = GenApiNodeMapGetNode(hNodeMap, featureName, &hNode);
    CHECK(res);
    if (GENAPIC_INVALID_HANDLE == hNode)
    {
        fprintf(stderr, "There is no feature named '%s'\n", featureName);
        exit(EXIT_FAILURE);
    }

    /* We want an integer feature node. */
    res = GenApiNodeGetType(hNode, &nodeType);
    CHECK(res);
    if (IntegerNode != nodeType)
    {
        fprintf(stderr, "'%s' is not an integer feature\n", featureName);
        exit(EXIT_FAILURE);
    }

    /* 
       Query the current value, the range of allowed values, and the increment of the feature. 
       For some integer features, you are not allowed to set every value within the 
       value range. For example, for some cameras the Width parameter must be a multiple 
       of 2. These constraints are expressed by the increment value. Valid values 
       follow the rule: val >= min && val <= max && val == min + n * inc.
    */

    res = GenApiNodeIsReadable(hNode, &bval);
    CHECK(res);

    if (bval)
    {
        res = GenApiIntegerGetMin(hNode, &min);       /* Get the minimum value. */
        CHECK(res); 
        res = GenApiIntegerGetMax(hNode, &max);       /* Get the maximum value. */
        CHECK(res);
        res = GenApiIntegerGetInc(hNode, &incr);       /* Get the increment value. */
        CHECK(res);
        res = GenApiIntegerGetValue(hNode, &val);     /* Get the current value. */
        CHECK(res);

#if __STDC_VERSION__ >= 199901L
        printf("%s: min= %lld  max= %lld  incr=%lld  Value=%lld\n", featureName, min, max, incr, val);
#else
        printf("%s: min= %I64d  max= %I64d  incr=%I64d  Value=%I64d\n", featureName, min, max, incr, val);
#endif

        res = GenApiNodeIsWritable(hNode, &bval);
        CHECK(res);

        if (bval)
        {
            /* Set the Width half-way between minimum and maximum. */
            res = GenApiIntegerSetValue(hNode, min + (max - min) / incr / 2 * incr);
            CHECK(res);
        }
        else
            fprintf(stderr, "Cannot set value for feature '%s' - node not writable\n", featureName);
    }
    else
        fprintf(stderr, "Cannot read feature '%s' - node not readable\n", featureName);
}

So far, only camera node maps have been considered. However, there are more objects that expose parameters through node maps:

  • The PylonDeviceGetTLNodeMap() function returns the node map for a device's transport layer.
  • The PylonStreamGrabberGetNodeMap() function is used to access a stream grabber's parameters.
  • The PylonEventGrabberGetNodeMap() function is used to access an event grabber's parameters.

Parameter access works identically for all types of node maps, and the same set of functions is used as for camera node maps. Note, however, that the objects listed above, transport layers in particular, may not expose any parameters at all. In this case, the corresponding function returns GENAPIC_INVALID_HANDLE; currently, this is the case for the IEEE 1394 transport layer. The sketch below shows the pattern.
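
The following is a minimal sketch, not part of the original sample code, assuming an open device handle hDev and an open stream grabber handle hGrabber as obtained earlier. It retrieves two of the node maps listed above and guards against the GENAPIC_INVALID_HANDLE case:

/* Minimal sketch (illustrative variable names). Retrieve the transport
   layer's node map and a stream grabber's node map; hDev and hGrabber are
   assumed to be valid, open handles. */
NODEMAP_HANDLE hTlNodeMap, hSgNodeMap;

/* Node map of the device's transport layer. */
res = PylonDeviceGetTLNodeMap( hDev, &hTlNodeMap );
CHECK(res);
if ( GENAPIC_INVALID_HANDLE == hTlNodeMap )
{
    /* The transport layer exposes no parameters (e.g. IEEE 1394). */
    printf( "The transport layer has no parameters.\n" );
}

/* Node map of a stream grabber. The same GenApi functions used for camera
   node maps (GenApiNodeMapGetNode() etc.) can be applied to hSgNodeMap. */
res = PylonStreamGrabberGetNodeMap( hGrabber, &hSgNodeMap );
CHECK(res);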

Browsing Parameters

The pylon Viewer tool provides an easy way of browsing camera parameters, their names, values, and ranges. Besides grabbing images (not available for Camera Link cameras), it can display all node maps for a camera device and all parameter nodes contained therein. The pylon Viewer has a Features window that shows a tree view of node maps, categories, and parameter nodes. Selecting a node in this view opens a dialog that displays the node's current value (if applicable) and may also allow changing it, subject to accessibility. There is also a Feature Documentation window, located at the very bottom of the display unless the layout was changed from the standard layout; it displays detailed information about the currently selected node.

Getting Notified About Parameter Changes

Getting notified when a parameter has been changed

The pylon C API provides functionality for installing callback functions that are called when a parameter's value or state (e.g. its access mode or value range) has changed.

Every callback is installed for a specific parameter. If the parameter itself has been touched or if another parameter that could possibly influence the state of the parameter has been changed, the callback will be invoked.

The example below illustrates how to find a parameter node and register a callback:

/* Register the callback function for ExposureEndEventFrameID parameter. */
/*... Get the node map containing all parameters. */
res = PylonDeviceGetNodeMap( hDev, &hNodeMap );
CHECK(res);
/* ... Get the ExposureEndEventFrameID parameter. */ 
res = GenApiNodeMapGetNode( hNodeMap, "ExposureEndEventFrameID", &hNode );
CHECK(res);

if ( GENAPIC_INVALID_HANDLE == hNode )
{
    /* There is no ExposureEndEventFrameID parameter. */
    fprintf( stderr, "There is no ExposureEndEventFrameID parameter.\n");
    PylonTerminate();
    pressEnterToExit();
    return EXIT_FAILURE;
}

/* ... Register the callback function. */
res = GenApiNodeRegisterCallback( hNode, endOfExposureCallback, &hCallback );
CHECK(res);
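
The callback itself is an ordinary C function that receives the handle of the node that changed. Below is a minimal sketch; the body, which reads the node as an integer feature, is illustrative and assumes the GENAPIC_CC calling convention used throughout the pylon C samples:

/* Minimal sketch of the callback registered above. Reading the node with
   GenApiIntegerGetValue() assumes the node is an integer feature such as
   ExposureEndEventFrameID. */
static void GENAPIC_CC endOfExposureCallback( NODE_HANDLE hNode )
{
    GENAPIC_RESULT  res;
    int64_t         frameId;

    res = GenApiIntegerGetValue( hNode, &frameId );
    CHECK(res);

#if __STDC_VERSION__ >= 199901L
    printf( "Got end-of-exposure event. Frame ID: %lld\n", frameId );
#else
    printf( "Got end-of-exposure event. Frame ID: %I64d\n", frameId );
#endif
}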

As an optimization, nodes whose values can only change as a direct result of a user action (an application writing a new value) can have their values cached on the computer to speed up read access. Other nodes can change their values asynchronously, e.g. as a result of an operation performed internally by the camera, and therefore cannot be cached. An application should call GenApiNodeMapPoll() at regular intervals. This updates the values of non-cacheable nodes in the node map, which in turn may cause callbacks to be executed as explained above.
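
As a rough sketch of such a polling loop (assuming GenApiNodeMapPoll() takes the node map handle and the elapsed time in milliseconds, and using an illustrative application flag named grabbing):

/* Sketch of a polling loop. POLL_INTERVAL_MS and 'grabbing' are illustrative;
   the elapsed-time parameter of GenApiNodeMapPoll() is assumed to be in ms. */
#define POLL_INTERVAL_MS 500

while ( grabbing )
{
    /* ... wait for and process grabbed buffers ... */

    /* Update the values of non-cacheable nodes; this may trigger
       registered callbacks such as endOfExposureCallback(). */
    res = GenApiNodeMapPoll( hNodeMap, POLL_INTERVAL_MS );
    CHECK(res);
}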

Multicast/Broadcast: Grab Images of One Camera on Multiple PCs

Broadcast/multicast: grabbing images from one camera and delivering them to multiple PCs; see the pylon C++ Programmer's Guide

Basler GigE cameras can be set to send the image data stream to multiple destinations. More information on this subject can be found in the pylon C++ Programmer's Guide.
