Media Foundation/V4L2 grabber ... (#1119)

* - New Media Foundation grabber
- JsonAPI available grabber fix
- commented json config removed

* Added libjpeg-turbo to dependencies

* Fix OSX build
Removed Azure Pipelines from build scripts

* Remove Platform from Dashboard

* Correct Grabber Namings

* Grabber UI improvements, generic JSONEditor Selection Update

* Active grabber fix

* Stop Framebuffer grabber on failure

* - Image format NV12 and I420 added
- Flip mode
- Scaling factor for MJPEG
- VSCode (compile before run)
- CI (push) dependency libjpeg-turbo added
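The NV12 and I420 additions both come down to decoding subsampled YUV into RGB. A minimal per-pixel sketch, assuming BT.601 limited-range coefficients (the grabber's exact conversion matrix is not shown in this change set):

```cpp
#include <algorithm>
#include <cstdint>

// Per-pixel YUV -> RGB conversion as needed when decoding NV12/I420 frames.
// Coefficients assume BT.601 limited range; the grabber may use another matrix.
struct Rgb { uint8_t r, g, b; };

static uint8_t clamp8(int v) { return static_cast<uint8_t>(std::min(255, std::max(0, v))); }

Rgb yuvToRgb(uint8_t y, uint8_t u, uint8_t v)
{
    const int c = y - 16, d = u - 128, e = v - 128;
    return { clamp8((298 * c + 409 * e + 128) >> 8),
             clamp8((298 * c - 100 * d - 208 * e + 128) >> 8),
             clamp8((298 * c + 516 * d + 128) >> 8) };
}
```

Both formats subsample chroma 2x2 and share this per-pixel math; they differ only in layout (NV12 has one interleaved UV plane after the Y plane, I420 has separate U and V planes), so only the chroma fetch differs.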

* Refactor MediaFoundation (Part 1)

* Remove QDebug output

* Added image flipping ability to MF Grabber

* fix issue 1160

* - Reload MF Grabber only once per WebUI update
- Cleanup

* Improvements

* - Set 'Software Frame Decimation' minimum to 0
- Removed grabber specific device name from Log
- Keep pixel format when switching resolution
- Display 'Flip mode' correct in Log
- BGR24 images always flipped
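For packed formats such as BGR24, a horizontal flip mode amounts to swapping whole pixels within each row. A minimal sketch with illustrative names (not the grabber's actual code):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Horizontal flip of a packed 3-bytes-per-pixel buffer (e.g. BGR24):
// swap whole pixels within each row. Illustrative only.
void flipHorizontal(std::vector<uint8_t>& img, int width, int height)
{
    const int bpp = 3; // bytes per pixel for BGR24
    for (int y = 0; y < height; ++y)
    {
        uint8_t* row = img.data() + static_cast<std::size_t>(y) * width * bpp;
        for (int xl = 0, xr = width - 1; xl < xr; ++xl, --xr)
            for (int c = 0; c < bpp; ++c)
                std::swap(row[xl * bpp + c], row[xr * bpp + c]);
    }
}
```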

* Refactor MediaFoundation (Part 2)

* Refactor V4L2 grabber (part 1) (#62)

* Media Foundation grabber adapted to V4L2 change

* Enable Media Foundation grabber on windows

* Have fps as int, fix height typo

* Added video standards to JsonAPI output

* Error handling in source reader improved

* Fix "Frame to small" error

* Discover VideoSources and dynamically update editor

* Hide all elements when no video grabber is discovered, update naming

* Do not show unsupported grabbers

* Copy Log to Clipboard

* Update Grabber schema and Defaults

* Update access levels and validate crop ranges
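Validating crop ranges means checking that the opposing crop values still leave visible pixels. A hedged sketch (function name and signature are illustrative, not Hyperion's actual API):

```cpp
// A crop range is valid only if the left/right and top/bottom crops together
// leave at least one visible pixel in each dimension. Illustrative only.
bool isCropRangeValid(int width, int height, int left, int right, int top, int bottom)
{
    if (left < 0 || right < 0 || top < 0 || bottom < 0)
        return false;
    return (left + right < width) && (top + bottom < height);
}
```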

* Height and width in Qt grabber corrected

* Correct formatting

* Untabify

* Global component states across instances

* Components divided on the dashboard

* refactor

* Fix Merge-issues

* Database migration aligning with updated grabber model

* Align Grabber.js with new utility functions

* Allow editor-validation for enum-lists

* Handle "Show Explanations" scenario correctly

* Grabber - Ensure save is only possible on valid content

* Dashboard update + fix GlobalSignal connection

* Ensure default database is populated with current release

* Correct grabber V4L2 access level

* Display Signal detection area in preview

* Write Hyperion version into default config on compiling.

* Create defaultconfig.json dynamically

* WebUI changes

* Correct grabber config look-ups

* Refactor i18n language loading

* Fix en.json

* Split global capture from instance capture config

* Update grabber default values

* Standalone grabber: Add --debug switch

* Enhance showInputOptionsForKey for multiple keys

* Add grabber instance link to system grabber config

* Only show signal detection area, if grabber is enabled

* Always show Active element on grabber page

* Remote control - Only display grabber status, if global grabber is enabled

* WebUI optimization (thx to @mkcologne)
Start Grabber only when global settings are enabled
Fixed an issue in the WebUI preview

* V4L2/MF changes

* Jsoneditor, Correct translation for default values

* Refactor LED-Device handling in UI and make element naming consistent

* MF Discovery extended

* Fix LGTM finding

* Support grabber Brightness, Hue, Saturation and Contrast in UI, plus their defaults

* Consider Access level for item filtering

* Consider Access level for item filtering

* Revert "Concider Access level for item filtering"

This reverts commit 5b0ce3c0f2.

* Disable fpsSoftwareDecimation for framegrabber, as not supported yet

* JSON-Editor- Add updated schema for validation on dynamic elements

* added V4L2 color IDs

* LGTM findings fix

* Destroy source reader callback only on exit

* Grabber.js - Hide elements not supported by platform

* Fixed freezing start effect

* Grabber UI - Hardware controls - Show current values and allow to reset to defaults

* Grabber - Discovery - Add current values to properties

* Small things

* Clean-up Effects and have ENDLESS consistently defined

* Fix on/off/on priority during startup, by initializing _prevVisComp in line with background priority

* Add missing translation mappings

* DirectX Grabber reactivated / Qt Grabber size decimation fixed

* Fix typo in push-master workflow

* Use PreciseTimer for Grabber to ensure stable FPS timing
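A precise timer stabilizes FPS because each frame is scheduled against an absolute deadline (start + n * period) rather than by re-arming a relative timer, so per-callback latency does not accumulate as drift. A sketch under that assumption (names are hypothetical; Hyperion's actual timer code differs):

```cpp
#include <chrono>

// Drift-free scheduling: the n-th capture deadline is derived from the start
// time and the integer FPS, so timing error does not accumulate over frames.
std::chrono::steady_clock::time_point
nthFrameDeadline(std::chrono::steady_clock::time_point start, int fps, long n)
{
    using namespace std::chrono;
    const auto period = duration_cast<steady_clock::duration>(seconds(1)) / fps;
    return start + n * period;
}
```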

* Set default Screencapture rate consistently

* Fix libjpeg-turbo download

* Remove Zero character from file

* docker-compile: Add PLATFORM parameter, only copy output file after successful compile

* Framebuffer, Dispmanx, OSX, AML Grabber discovery, various clean-up and consistencies across grabbers

* Fix merge problem in docker-compile: Add PLATFORM parameter, only copy output file after successful compile

* Fix definition

* OSXFrameGrabber - Revert cast

* Clean-ups after feedback

* Disable certain libraries when building amlogic via standard stretch image as developer

* Add CEC availability to ServerInfo to have it platform independent

* Grabber UI - Fix problem that crop values are not populated when refining editor range

* Preserve value when updating json-editor range

* LEDVisualisation - Clear image when source changes

* Fix - Preserve value when updating json-editor range

* LEDVisualisation - Clear image when no component is active

* Allow to have password handled by Password-Manager (#1263)

* Update default signal detection area to green assuming rainbow grabber

* LED Visualisation - Handle empty priority update

* Fix yuv420 in v4l2 grabber

* V4L2-Grabber discovery - Only report grabbers with valid video input information

* Grabber - Update static variables to have them working in release build

* LED Visualisation - ClearImage when no priorities

* LED Visualisation - Fix Logo resizing issue

* LED Visualisation - Have nearly black background and negative logo

Co-authored-by: LordGrey <lordgrey.emmel@gmail.com>
Co-authored-by: LordGrey <48840279+Lord-Grey@users.noreply.github.com>
Committed by Markus via GitHub on 2021-07-14 20:48:33 +02:00
Parent b0e1510a78, commit c135d91986
163 changed files with 10756 additions and 5953 deletions


@@ -1,3 +1,12 @@
# Find the BCM-package (VC control)
IF ( "${PLATFORM}" MATCHES rpi)
find_package(BCM REQUIRED)
include_directories(${BCM_INCLUDE_DIRS})
ELSE()
SET(BCM_INCLUDE_DIRS "")
SET(BCM_LIBRARIES "")
ENDIF()
# Define the current source locations
SET(CURRENT_HEADER_DIR ${CMAKE_SOURCE_DIR}/include/api)
@@ -12,6 +21,11 @@ add_library(hyperion-api
${Api_RESOURCES}
)
if(ENABLE_DX)
include_directories(${DIRECTX9_INCLUDE_DIRS})
target_link_libraries(hyperion-api ${DIRECTX9_LIBRARIES})
endif(ENABLE_DX)
target_link_libraries(hyperion-api
hyperion
hyperion-utils


@@ -0,0 +1,28 @@
{
"type":"object",
"required":true,
"properties": {
"command": {
"type": "string",
"required": true,
"enum": [ "inputsource" ]
},
"tan": {
"type": "integer"
},
"subcommand": {
"type": "string",
"required": true,
"enum": [ "discover", "getProperties" ]
},
"sourceType": {
"type": "string",
"required": true
},
"params": {
"type": "object",
"required": false
}
},
"additionalProperties": false
}
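For illustration, a request that validates against this new schema could look like the following (the `sourceType` value is an assumption; the schema only requires it to be a string):

```json
{
    "command": "inputsource",
    "subcommand": "discover",
    "sourceType": "video",
    "tan": 1
}
```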


@@ -5,7 +5,7 @@
"command": {
"type" : "string",
"required" : true,
"enum" : ["color", "image", "effect", "create-effect", "delete-effect", "serverinfo", "clear", "clearall", "adjustment", "sourceselect", "config", "componentstate", "ledcolors", "logging", "processing", "sysinfo", "videomode", "authorize", "instance", "leddevice", "transform", "correction" , "temperature"]
"enum": [ "color", "image", "effect", "create-effect", "delete-effect", "serverinfo", "clear", "clearall", "adjustment", "sourceselect", "config", "componentstate", "ledcolors", "logging", "processing", "sysinfo", "videomode", "authorize", "instance", "leddevice", "inputsource", "transform", "correction", "temperature" ]
}
}
}


@@ -20,7 +20,8 @@
<file alias="schema-videomode">JSONRPC_schema/schema-videomode.json</file>
<file alias="schema-authorize">JSONRPC_schema/schema-authorize.json</file>
<file alias="schema-instance">JSONRPC_schema/schema-instance.json</file>
<file alias="schema-leddevice">JSONRPC_schema/schema-leddevice.json</file>
<file alias="schema-leddevice">JSONRPC_schema/schema-leddevice.json</file>
<file alias="schema-inputsource">JSONRPC_schema/schema-inputsource.json</file>
<!-- The following schemas are derecated but used to ensure backward compatibility with hyperion Classic remote control-->
<file alias="schema-transform">JSONRPC_schema/schema-hyperion-classic.json</file>
<file alias="schema-correction">JSONRPC_schema/schema-hyperion-classic.json</file>


@@ -16,7 +16,45 @@
#include <leddevice/LedDevice.h>
#include <leddevice/LedDeviceFactory.h>
#include <HyperionConfig.h> // Required to determine the cmake options
#include <hyperion/GrabberWrapper.h>
#include <grabber/QtGrabber.h>
#if defined(ENABLE_MF)
#include <grabber/MFGrabber.h>
#elif defined(ENABLE_V4L2)
#include <grabber/V4L2Grabber.h>
#endif
#if defined(ENABLE_X11)
#include <grabber/X11Grabber.h>
#endif
#if defined(ENABLE_XCB)
#include <grabber/XcbGrabber.h>
#endif
#if defined(ENABLE_DX)
#include <grabber/DirectXGrabber.h>
#endif
#if defined(ENABLE_FB)
#include <grabber/FramebufferFrameGrabber.h>
#endif
#if defined(ENABLE_DISPMANX)
#include <grabber/DispmanxFrameGrabber.h>
#endif
#if defined(ENABLE_AMLOGIC)
#include <grabber/AmlogicGrabber.h>
#endif
#if defined(ENABLE_OSX)
#include <grabber/OsxFrameGrabber.h>
#endif
#include <utils/jsonschema/QJsonFactory.h>
#include <utils/jsonschema/QJsonSchemaChecker.h>
#include <HyperionConfig.h>
@@ -41,7 +79,10 @@
using namespace hyperion;
JsonAPI::JsonAPI(QString peerAddress, Logger* log, bool localConnection, QObject* parent, bool noListener)
// Constants
namespace { const bool verbose = false; }
JsonAPI::JsonAPI(QString peerAddress, Logger *log, bool localConnection, QObject *parent, bool noListener)
: API(log, localConnection, parent)
{
_noListener = noListener;
@@ -86,7 +127,7 @@ bool JsonAPI::handleInstanceSwitch(quint8 inst, bool forced)
return false;
}
void JsonAPI::handleMessage(const QString& messageString, const QString& httpAuthHeader)
void JsonAPI::handleMessage(const QString &messageString, const QString &httpAuthHeader)
{
const QString ident = "JsonRpc@" + _peerAddress;
QJsonObject message;
@@ -174,6 +215,8 @@ proceed:
handleInstanceCommand(message, command, tan);
else if (command == "leddevice")
handleLedDeviceCommand(message, command, tan);
else if (command == "inputsource")
handleInputSourceCommand(message, command, tan);
// BEGIN | The following commands are deprecated but used to ensure backward compatibility with hyperion Classic remote control
else if (command == "clearall")
@@ -187,17 +230,17 @@ proceed:
handleNotImplemented(command, tan);
}
void JsonAPI::handleColorCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleColorCommand(const QJsonObject &message, const QString &command, int tan)
{
emit forwardJsonMessage(message);
int priority = message["priority"].toInt();
int duration = message["duration"].toInt(-1);
const QString origin = message["origin"].toString("JsonRpc") + "@" + _peerAddress;
const QJsonArray& jsonColor = message["color"].toArray();
const QJsonArray &jsonColor = message["color"].toArray();
std::vector<uint8_t> colors;
// TODO faster copy
for (const auto& entry : jsonColor)
for (const auto &entry : jsonColor)
{
colors.emplace_back(uint8_t(entry.toInt()));
}
@@ -206,7 +249,7 @@ void JsonAPI::handleColorCommand(const QJsonObject& message, const QString& comm
sendSuccessReply(command, tan);
}
void JsonAPI::handleImageCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleImageCommand(const QJsonObject &message, const QString &command, int tan)
{
emit forwardJsonMessage(message);
@@ -230,7 +273,7 @@ void JsonAPI::handleImageCommand(const QJsonObject& message, const QString& comm
sendSuccessReply(command, tan);
}
void JsonAPI::handleEffectCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleEffectCommand(const QJsonObject &message, const QString &command, int tan)
{
emit forwardJsonMessage(message);
@@ -249,19 +292,19 @@ void JsonAPI::handleEffectCommand(const QJsonObject& message, const QString& com
sendErrorReply("Effect '" + dat.effectName + "' not found", command, tan);
}
void JsonAPI::handleCreateEffectCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleCreateEffectCommand(const QJsonObject &message, const QString &command, int tan)
{
const QString resultMsg = API::saveEffect(message);
resultMsg.isEmpty() ? sendSuccessReply(command, tan) : sendErrorReply(resultMsg, command, tan);
}
void JsonAPI::handleDeleteEffectCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleDeleteEffectCommand(const QJsonObject &message, const QString &command, int tan)
{
const QString res = API::deleteEffect(message["name"].toString());
res.isEmpty() ? sendSuccessReply(command, tan) : sendErrorReply(res, command, tan);
}
void JsonAPI::handleSysInfoCommand(const QJsonObject&, const QString& command, int tan)
void JsonAPI::handleSysInfoCommand(const QJsonObject &, const QString &command, int tan)
{
// create result
QJsonObject result;
@@ -304,7 +347,7 @@ void JsonAPI::handleSysInfoCommand(const QJsonObject&, const QString& command, i
emit callbackMessage(result);
}
void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleServerInfoCommand(const QJsonObject &message, const QString &command, int tan)
{
QJsonObject info;
@@ -315,9 +358,9 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
activePriorities.removeAll(255);
int currentPriority = _hyperion->getCurrentPriority();
for (int priority : activePriorities)
for(int priority : activePriorities)
{
const Hyperion::InputInfo& priorityInfo = _hyperion->getPriorityInfo(priority);
const Hyperion::InputInfo &priorityInfo = _hyperion->getPriorityInfo(priority);
QJsonObject item;
item["priority"] = priority;
if (priorityInfo.timeoutTime_ms > 0)
@@ -349,9 +392,9 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// add HSL Value to Array
QJsonArray HSLValue;
ColorSys::rgb2hsl(priorityInfo.ledColors.begin()->red,
priorityInfo.ledColors.begin()->green,
priorityInfo.ledColors.begin()->blue,
Hue, Saturation, Luminace);
priorityInfo.ledColors.begin()->green,
priorityInfo.ledColors.begin()->blue,
Hue, Saturation, Luminace);
HSLValue.append(Hue);
HSLValue.append(Saturation);
@@ -362,8 +405,8 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
}
(priority == currentPriority)
? priorities.prepend(item)
: priorities.append(item);
? priorities.prepend(item)
: priorities.append(item);
}
info["priorities"] = priorities;
@@ -371,9 +414,9 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// collect adjustment information
QJsonArray adjustmentArray;
for (const QString& adjustmentId : _hyperion->getAdjustmentIds())
for (const QString &adjustmentId : _hyperion->getAdjustmentIds())
{
const ColorAdjustment* colorAdjustment = _hyperion->getAdjustment(adjustmentId);
const ColorAdjustment *colorAdjustment = _hyperion->getAdjustment(adjustmentId);
if (colorAdjustment == nullptr)
{
Error(_log, "Incorrect color adjustment id: %s", QSTRING_CSTR(adjustmentId));
@@ -440,8 +483,8 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// collect effect info
QJsonArray effects;
const std::list<EffectDefinition>& effectsDefinitions = _hyperion->getEffects();
for (const EffectDefinition& effectDefinition : effectsDefinitions)
const std::list<EffectDefinition> &effectsDefinitions = _hyperion->getEffects();
for (const EffectDefinition &effectDefinition : effectsDefinitions)
{
QJsonObject effect;
effect["name"] = effectDefinition.name;
@@ -467,11 +510,18 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
QJsonObject grabbers;
QJsonArray availableGrabbers;
#if defined(ENABLE_DISPMANX) || defined(ENABLE_V4L2) || defined(ENABLE_FB) || defined(ENABLE_AMLOGIC) || defined(ENABLE_OSX) || defined(ENABLE_X11) || defined(ENABLE_XCB) || defined(ENABLE_QT)
#if defined(ENABLE_DISPMANX) || defined(ENABLE_V4L2) || defined(ENABLE_MF) || defined(ENABLE_FB) || defined(ENABLE_AMLOGIC) || defined(ENABLE_OSX) || defined(ENABLE_X11) || defined(ENABLE_XCB) || defined(ENABLE_QT)
if (GrabberWrapper::getInstance() != nullptr)
if ( GrabberWrapper::getInstance() != nullptr )
{
grabbers["active"] = GrabberWrapper::getInstance()->getActive();
QStringList activeGrabbers = GrabberWrapper::getInstance()->getActive(_hyperion->getInstanceIndex());
QJsonArray activeGrabberNames;
for (auto grabberName : activeGrabbers)
{
activeGrabberNames.append(grabberName);
}
grabbers["active"] = activeGrabberNames;
}
// get available grabbers
@@ -480,55 +530,20 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
availableGrabbers.append(grabber);
}
#endif
#if defined(ENABLE_V4L2)
QJsonArray availableV4L2devices;
for (const auto& devicePath : GrabberWrapper::getInstance()->getV4L2devices())
{
QJsonObject device;
device["device"] = devicePath;
device["name"] = GrabberWrapper::getInstance()->getV4L2deviceName(devicePath);
QJsonArray availableInputs;
QMultiMap<QString, int> inputs = GrabberWrapper::getInstance()->getV4L2deviceInputs(devicePath);
for (auto input = inputs.begin(); input != inputs.end(); input++)
{
QJsonObject availableInput;
availableInput["inputName"] = input.key();
availableInput["inputIndex"] = input.value();
availableInputs.append(availableInput);
}
device.insert("inputs", availableInputs);
QJsonArray availableResolutions;
QStringList resolutions = GrabberWrapper::getInstance()->getResolutions(devicePath);
for (auto resolution : resolutions)
{
availableResolutions.append(resolution);
}
device.insert("resolutions", availableResolutions);
QJsonArray availableFramerates;
QStringList framerates = GrabberWrapper::getInstance()->getFramerates(devicePath);
for (auto framerate : framerates)
{
availableFramerates.append(framerate);
}
device.insert("framerates", availableFramerates);
availableV4L2devices.append(device);
}
grabbers["v4l2_properties"] = availableV4L2devices;
#endif
grabbers["available"] = availableGrabbers;
info["videomode"] = QString(videoMode2String(_hyperion->getCurrentVideoMode()));
info["grabbers"] = grabbers;
QJsonObject cecInfo;
#if defined(ENABLE_CEC)
cecInfo["enabled"] = true;
#else
cecInfo["enabled"] = false;
#endif
info["cec"] = cecInfo;
// get available components
QJsonArray component;
std::map<hyperion::Components, bool> components = _hyperion->getComponentRegister().getRegister();
@@ -547,7 +562,7 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// add sessions
QJsonArray sessions;
#ifdef ENABLE_AVAHI
for (auto session : BonjourBrowserWrapper::getInstance()->getAllServices())
for (auto session: BonjourBrowserWrapper::getInstance()->getAllServices())
{
if (session.port < 0)
continue;
@@ -564,7 +579,7 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
#endif
// add instance info
QJsonArray instanceInfo;
for (const auto& entry : API::getAllInstanceData())
for (const auto &entry : API::getAllInstanceData())
{
QJsonObject obj;
obj.insert("friendly_name", entry["friendly_name"].toString());
@@ -586,7 +601,7 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// TRANSFORM INFORMATION (DEFAULT VALUES)
QJsonArray transformArray;
for (const QString& transformId : _hyperion->getAdjustmentIds())
for (const QString &transformId : _hyperion->getAdjustmentIds())
{
QJsonObject transform;
QJsonArray blacklevel, whitelevel, gamma, threshold;
@@ -617,7 +632,7 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// ACTIVE EFFECT INFO
QJsonArray activeEffects;
for (const ActiveEffectDefinition& activeEffectDefinition : _hyperion->getActiveEffects())
for (const ActiveEffectDefinition &activeEffectDefinition : _hyperion->getActiveEffects())
{
if (activeEffectDefinition.priority != PriorityMuxer::LOWEST_PRIORITY - 1)
{
@@ -634,15 +649,15 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// ACTIVE STATIC LED COLOR
QJsonArray activeLedColors;
const Hyperion::InputInfo& priorityInfo = _hyperion->getPriorityInfo(_hyperion->getCurrentPriority());
const Hyperion::InputInfo &priorityInfo = _hyperion->getPriorityInfo(_hyperion->getCurrentPriority());
if (priorityInfo.componentId == hyperion::COMP_COLOR && !priorityInfo.ledColors.empty())
{
QJsonObject LEDcolor;
// check if LED Color not Black (0,0,0)
if ((priorityInfo.ledColors.begin()->red +
priorityInfo.ledColors.begin()->green +
priorityInfo.ledColors.begin()->blue !=
0))
priorityInfo.ledColors.begin()->green +
priorityInfo.ledColors.begin()->blue !=
0))
{
QJsonObject LEDcolor;
@@ -659,9 +674,9 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
// add HSL Value to Array
QJsonArray HSLValue;
ColorSys::rgb2hsl(priorityInfo.ledColors.begin()->red,
priorityInfo.ledColors.begin()->green,
priorityInfo.ledColors.begin()->blue,
Hue, Saturation, Luminace);
priorityInfo.ledColors.begin()->green,
priorityInfo.ledColors.begin()->blue,
Hue, Saturation, Luminace);
HSLValue.append(Hue);
HSLValue.append(Saturation);
@@ -706,7 +721,7 @@ void JsonAPI::handleServerInfoCommand(const QJsonObject& message, const QString&
}
}
void JsonAPI::handleClearCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleClearCommand(const QJsonObject &message, const QString &command, int tan)
{
emit forwardJsonMessage(message);
int priority = message["priority"].toInt();
@@ -720,7 +735,7 @@ void JsonAPI::handleClearCommand(const QJsonObject& message, const QString& comm
sendSuccessReply(command, tan);
}
void JsonAPI::handleClearallCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleClearallCommand(const QJsonObject &message, const QString &command, int tan)
{
emit forwardJsonMessage(message);
QString replyMsg;
@@ -728,12 +743,12 @@ void JsonAPI::handleClearallCommand(const QJsonObject& message, const QString& c
sendSuccessReply(command, tan);
}
void JsonAPI::handleAdjustmentCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleAdjustmentCommand(const QJsonObject &message, const QString &command, int tan)
{
const QJsonObject& adjustment = message["adjustment"].toObject();
const QJsonObject &adjustment = message["adjustment"].toObject();
const QString adjustmentId = adjustment["id"].toString(_hyperion->getAdjustmentIds().first());
ColorAdjustment* colorAdjustment = _hyperion->getAdjustment(adjustmentId);
ColorAdjustment *colorAdjustment = _hyperion->getAdjustment(adjustmentId);
if (colorAdjustment == nullptr)
{
Warning(_log, "Incorrect adjustment identifier: %s", adjustmentId.toStdString().c_str());
@@ -742,39 +757,39 @@ void JsonAPI::handleAdjustmentCommand(const QJsonObject& message, const QString&
if (adjustment.contains("red"))
{
const QJsonArray& values = adjustment["red"].toArray();
const QJsonArray &values = adjustment["red"].toArray();
colorAdjustment->_rgbRedAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
if (adjustment.contains("green"))
{
const QJsonArray& values = adjustment["green"].toArray();
const QJsonArray &values = adjustment["green"].toArray();
colorAdjustment->_rgbGreenAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
if (adjustment.contains("blue"))
{
const QJsonArray& values = adjustment["blue"].toArray();
const QJsonArray &values = adjustment["blue"].toArray();
colorAdjustment->_rgbBlueAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
if (adjustment.contains("cyan"))
{
const QJsonArray& values = adjustment["cyan"].toArray();
const QJsonArray &values = adjustment["cyan"].toArray();
colorAdjustment->_rgbCyanAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
if (adjustment.contains("magenta"))
{
const QJsonArray& values = adjustment["magenta"].toArray();
const QJsonArray &values = adjustment["magenta"].toArray();
colorAdjustment->_rgbMagentaAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
if (adjustment.contains("yellow"))
{
const QJsonArray& values = adjustment["yellow"].toArray();
const QJsonArray &values = adjustment["yellow"].toArray();
colorAdjustment->_rgbYellowAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
if (adjustment.contains("white"))
{
const QJsonArray& values = adjustment["white"].toArray();
const QJsonArray &values = adjustment["white"].toArray();
colorAdjustment->_rgbWhiteAdjustment.setAdjustment(values[0u].toInt(), values[1u].toInt(), values[2u].toInt());
}
@@ -814,7 +829,7 @@ void JsonAPI::handleAdjustmentCommand(const QJsonObject& message, const QString&
sendSuccessReply(command, tan);
}
void JsonAPI::handleSourceSelectCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleSourceSelectCommand(const QJsonObject &message, const QString &command, int tan)
{
if (message.contains("auto"))
{
@@ -832,7 +847,7 @@ void JsonAPI::handleSourceSelectCommand(const QJsonObject& message, const QStrin
sendSuccessReply(command, tan);
}
void JsonAPI::handleConfigCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleConfigCommand(const QJsonObject &message, const QString &command, int tan)
{
QString subcommand = message["subcommand"].toString("");
QString full_command = command + "-" + subcommand;
@@ -876,14 +891,14 @@ void JsonAPI::handleConfigCommand(const QJsonObject& message, const QString& com
}
}
void JsonAPI::handleConfigSetCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleConfigSetCommand(const QJsonObject &message, const QString &command, int tan)
{
if (message.contains("config"))
{
QJsonObject config = message["config"].toObject();
if (API::isHyperionEnabled())
{
if (API::saveSettings(config))
if ( API::saveSettings(config) )
{
sendSuccessReply(command, tan);
}
@@ -897,7 +912,7 @@ void JsonAPI::handleConfigSetCommand(const QJsonObject& message, const QString&
}
}
void JsonAPI::handleSchemaGetCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleSchemaGetCommand(const QJsonObject &message, const QString &command, int tan)
{
// create result
QJsonObject schemaJson, alldevices, properties;
@@ -912,7 +927,7 @@ void JsonAPI::handleSchemaGetCommand(const QJsonObject& message, const QString&
{
schemaJson = QJsonFactory::readSchema(schemaFile);
}
catch (const std::runtime_error& error)
catch (const std::runtime_error &error)
{
throw std::runtime_error(error.what());
}
@@ -949,9 +964,9 @@ void JsonAPI::handleSchemaGetCommand(const QJsonObject& message, const QString&
sendSuccessDataReply(QJsonDocument(schemaJson), command, tan);
}
void JsonAPI::handleComponentStateCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleComponentStateCommand(const QJsonObject &message, const QString &command, int tan)
{
const QJsonObject& componentState = message["componentstate"].toObject();
const QJsonObject &componentState = message["componentstate"].toObject();
QString comp = componentState["component"].toString("invalid");
bool compState = componentState["state"].toBool(true);
QString replyMsg;
@@ -964,7 +979,7 @@ void JsonAPI::handleComponentStateCommand(const QJsonObject& message, const QStr
sendSuccessReply(command, tan);
}
void JsonAPI::handleLedColorsCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleLedColorsCommand(const QJsonObject &message, const QString &command, int tan)
{
// create result
QString subcommand = message["subcommand"].toString("");
@@ -978,22 +993,22 @@ void JsonAPI::handleLedColorsCommand(const QJsonObject& message, const QString&
_streaming_leds_reply["command"] = command + "-ledstream-update";
_streaming_leds_reply["tan"] = tan;
connect(_hyperion, &Hyperion::rawLedColors, this, [=](const std::vector<ColorRgb>& ledValues) {
connect(_hyperion, &Hyperion::rawLedColors, this, [=](const std::vector<ColorRgb> &ledValues) {
_currentLedValues = ledValues;
// necessary because Qt::UniqueConnection for lambdas does not work until 5.9
// see: https://bugreports.qt.io/browse/QTBUG-52438
if (!_ledStreamConnection)
_ledStreamConnection = connect(_ledStreamTimer, &QTimer::timeout, this, [=]() {
emit streamLedcolorsUpdate(_currentLedValues);
},
Qt::UniqueConnection);
emit streamLedcolorsUpdate(_currentLedValues);
},
Qt::UniqueConnection);
// start the timer
if (!_ledStreamTimer->isActive() || _ledStreamTimer->interval() != streaming_interval)
_ledStreamTimer->start(streaming_interval);
},
Qt::UniqueConnection);
},
Qt::UniqueConnection);
// push once
_hyperion->update();
}
@@ -1023,7 +1038,7 @@ void JsonAPI::handleLedColorsCommand(const QJsonObject& message, const QString&
sendSuccessReply(command + "-" + subcommand, tan);
}
void JsonAPI::handleLoggingCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleLoggingCommand(const QJsonObject &message, const QString &command, int tan)
{
// create result
QString subcommand = message["subcommand"].toString("");
@@ -1065,25 +1080,25 @@ void JsonAPI::handleLoggingCommand(const QJsonObject& message, const QString& co
}
}
void JsonAPI::handleProcessingCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleProcessingCommand(const QJsonObject &message, const QString &command, int tan)
{
API::setLedMappingType(ImageProcessor::mappingTypeToInt(message["mappingType"].toString("multicolor_mean")));
sendSuccessReply(command, tan);
}
void JsonAPI::handleVideoModeCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleVideoModeCommand(const QJsonObject &message, const QString &command, int tan)
{
API::setVideoMode(parse3DMode(message["videoMode"].toString("2D")));
sendSuccessReply(command, tan);
}
void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString& command, int tan)
void JsonAPI::handleAuthorizeCommand(const QJsonObject &message, const QString &command, int tan)
{
const QString& subc = message["subcommand"].toString().trimmed();
const QString& id = message["id"].toString().trimmed();
const QString& password = message["password"].toString().trimmed();
const QString& newPassword = message["newPassword"].toString().trimmed();
const QString& comment = message["comment"].toString().trimmed();
const QString &subc = message["subcommand"].toString().trimmed();
const QString &id = message["id"].toString().trimmed();
const QString &password = message["password"].toString().trimmed();
const QString &newPassword = message["newPassword"].toString().trimmed();
const QString &comment = message["comment"].toString().trimmed();
// catch test if auth is required
if (subc == "tokenRequired")
@@ -1194,8 +1209,8 @@ void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString&
if (subc == "requestToken")
{
// use id/comment
const QString& comment = message["comment"].toString().trimmed();
const bool& acc = message["accept"].toBool(true);
const QString &comment = message["comment"].toString().trimmed();
const bool &acc = message["accept"].toBool(true);
if (acc)
API::setNewTokenRequest(comment, id, tan);
else
@@ -1211,7 +1226,7 @@ void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString&
if (API::getPendingTokenRequests(vec))
{
QJsonArray arr;
for (const auto& entry : vec)
for (const auto &entry : vec)
{
QJsonObject obj;
obj["comment"] = entry.comment;
@@ -1233,7 +1248,7 @@ void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString&
if (subc == "answerRequest")
{
// use id
const bool &accept = message["accept"].toBool(false);
if (!API::handlePendingTokenRequest(id, accept))
sendErrorReply("No Authorization", command + "-" + subc, tan);
return;
@@ -1246,7 +1261,7 @@ void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString&
if (API::getTokenList(defVect))
{
QJsonArray tArr;
for (const auto &entry : defVect)
{
QJsonObject subO;
subO["comment"] = entry.comment;
@@ -1265,7 +1280,7 @@ void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString&
// login
if (subc == "login")
{
const QString &token = message["token"].toString().trimmed();
// catch token
if (!token.isEmpty())
@@ -1313,11 +1328,11 @@ void JsonAPI::handleAuthorizeCommand(const QJsonObject& message, const QString&
}
}
void JsonAPI::handleInstanceCommand(const QJsonObject &message, const QString &command, int tan)
{
const QString &subc = message["subcommand"].toString();
const quint8 &inst = message["instance"].toInt();
const QString &name = message["name"].toString();
if (subc == "switchTo")
{
@@ -1334,7 +1349,7 @@ void JsonAPI::handleInstanceCommand(const QJsonObject& message, const QString& c
if (subc == "startInstance")
{
connect(this, &API::onStartInstanceResponse, [=] (const int &tan) { sendSuccessReply(command + "-" + subc, tan); });
if (!API::startInstance(inst, tan))
sendErrorReply("Can't start Hyperion instance index " + QString::number(inst), command + "-" + subc, tan);
@@ -1384,12 +1399,12 @@ void JsonAPI::handleInstanceCommand(const QJsonObject& message, const QString& c
}
}
void JsonAPI::handleLedDeviceCommand(const QJsonObject &message, const QString &command, int tan)
{
Debug(_log, "message: [%s]", QString(QJsonDocument(message).toJson(QJsonDocument::Compact)).toUtf8().constData());
const QString &subc = message["subcommand"].toString().trimmed();
const QString &devType = message["ledDeviceType"].toString().trimmed();
QString full_command = command + "-" + subc;
@@ -1399,7 +1414,7 @@ void JsonAPI::handleLedDeviceCommand(const QJsonObject& message, const QString&
sendErrorReply("Unknown device", full_command, tan);
}
else
*/ {
QJsonObject config;
config.insert("type", devType);
LedDevice* ledDevice = nullptr;
@@ -1407,27 +1422,27 @@ void JsonAPI::handleLedDeviceCommand(const QJsonObject& message, const QString&
if (subc == "discover")
{
ledDevice = LedDeviceFactory::construct(config);
const QJsonObject &params = message["params"].toObject();
const QJsonObject devicesDiscovered = ledDevice->discover(params);
Debug(_log, "response: [%s]", QString(QJsonDocument(devicesDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
sendSuccessDataReply(QJsonDocument(devicesDiscovered), full_command, tan);
}
else if (subc == "getProperties")
{
ledDevice = LedDeviceFactory::construct(config);
const QJsonObject &params = message["params"].toObject();
const QJsonObject deviceProperties = ledDevice->getProperties(params);
Debug(_log, "response: [%s]", QString(QJsonDocument(deviceProperties).toJson(QJsonDocument::Compact)).toUtf8().constData());
sendSuccessDataReply(QJsonDocument(deviceProperties), full_command, tan);
}
else if (subc == "identify")
{
ledDevice = LedDeviceFactory::construct(config);
const QJsonObject &params = message["params"].toObject();
ledDevice->identify(params);
sendSuccessReply(full_command, tan);
@@ -1441,12 +1456,152 @@ void JsonAPI::handleLedDeviceCommand(const QJsonObject& message, const QString&
}
}
void JsonAPI::handleInputSourceCommand(const QJsonObject& message, const QString& command, int tan)
{
DebugIf(verbose, _log, "message: [%s]", QString(QJsonDocument(message).toJson(QJsonDocument::Compact)).toUtf8().constData());
const QString& subc = message["subcommand"].toString().trimmed();
const QString& sourceType = message["sourceType"].toString().trimmed();
QString full_command = command + "-" + subc;
// TODO: Validate that source type is a valid one
/* if ( ! valid type )
{
sendErrorReply("Unknown device", full_command, tan);
}
else
*/ {
if (subc == "discover")
{
QJsonObject inputSourcesDiscovered;
inputSourcesDiscovered.insert("sourceType", sourceType);
QJsonArray videoInputs;
#if defined(ENABLE_V4L2) || defined(ENABLE_MF)
if (sourceType == "video" )
{
#if defined(ENABLE_MF)
MFGrabber* grabber = new MFGrabber();
#elif defined(ENABLE_V4L2)
V4L2Grabber* grabber = new V4L2Grabber();
#endif
QJsonObject params;
videoInputs = grabber->discover(params);
delete grabber;
}
else
#endif
{
DebugIf(verbose, _log, "sourceType: [%s]", QSTRING_CSTR(sourceType));
if (sourceType == "screen")
{
QJsonObject params;
QJsonObject device;
#ifdef ENABLE_QT
QtGrabber* qtgrabber = new QtGrabber();
device = qtgrabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete qtgrabber;
#endif
#ifdef ENABLE_DX
DirectXGrabber* dxgrabber = new DirectXGrabber();
device = dxgrabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete dxgrabber;
#endif
#ifdef ENABLE_X11
X11Grabber* x11Grabber = new X11Grabber();
device = x11Grabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete x11Grabber;
#endif
#ifdef ENABLE_XCB
XcbGrabber* xcbGrabber = new XcbGrabber();
device = xcbGrabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete xcbGrabber;
#endif
#ifdef ENABLE_FB
FramebufferFrameGrabber* fbGrabber = new FramebufferFrameGrabber();
device = fbGrabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete fbGrabber;
#endif
#if defined(ENABLE_DISPMANX)
DispmanxFrameGrabber* dispmanx = new DispmanxFrameGrabber();
device = dispmanx->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete dispmanx;
#endif
#if defined(ENABLE_AMLOGIC)
AmlogicGrabber* amlGrabber = new AmlogicGrabber();
device = amlGrabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete amlGrabber;
#endif
#if defined(ENABLE_OSX)
OsxFrameGrabber* osxGrabber = new OsxFrameGrabber();
device = osxGrabber->discover(params);
if (!device.isEmpty() )
{
videoInputs.append(device);
}
delete osxGrabber;
#endif
}
}
inputSourcesDiscovered["video_sources"] = videoInputs;
DebugIf(verbose, _log, "response: [%s]", QString(QJsonDocument(inputSourcesDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
sendSuccessDataReply(QJsonDocument(inputSourcesDiscovered), full_command, tan);
}
else
{
sendErrorReply("Unknown or missing subcommand", full_command, tan);
}
}
}
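For reference, the discovery payload assembled by this handler has roughly the following shape (a hypothetical example: the `sourceType` and `video_sources` keys come from the code above, while the entry fields `device`, `device_name`, `type` and `video_inputs` follow the individual grabbers' `discover()` results and vary per platform):

```json
{
  "sourceType": "screen",
  "video_sources": [
    {
      "device": "dx",
      "device_name": "DX",
      "type": "screen",
      "video_inputs": []
    }
  ]
}
```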
void JsonAPI::handleNotImplemented(const QString &command, int tan)
{
sendErrorReply("Command not implemented", command, tan);
}
void JsonAPI::sendSuccessReply(const QString &command, int tan)
{
// create reply
QJsonObject reply;
@@ -1458,7 +1613,7 @@ void JsonAPI::sendSuccessReply(const QString& command, int tan)
emit callbackMessage(reply);
}
void JsonAPI::sendSuccessDataReply(const QJsonDocument &doc, const QString &command, int tan)
{
QJsonObject reply;
reply["success"] = true;
@@ -1472,7 +1627,7 @@ void JsonAPI::sendSuccessDataReply(const QJsonDocument& doc, const QString& comm
emit callbackMessage(reply);
}
void JsonAPI::sendErrorReply(const QString &error, const QString &command, int tan)
{
// create reply
QJsonObject reply;
@@ -1485,12 +1640,12 @@ void JsonAPI::sendErrorReply(const QString& error, const QString& command, int t
emit callbackMessage(reply);
}
void JsonAPI::streamLedcolorsUpdate(const std::vector<ColorRgb> &ledColors)
{
QJsonObject result;
QJsonArray leds;
for (const auto &color : ledColors)
{
leds << QJsonValue(color.red) << QJsonValue(color.green) << QJsonValue(color.blue);
}
@@ -1502,9 +1657,9 @@ void JsonAPI::streamLedcolorsUpdate(const std::vector<ColorRgb>& ledColors)
emit callbackMessage(_streaming_leds_reply);
}
void JsonAPI::setImage(const Image<ColorRgb> &image)
{
QImage jpgImage((const uint8_t *)image.memptr(), image.width(), image.height(), 3 * image.width(), QImage::Format_RGB888);
QByteArray ba;
QBuffer buffer(&ba);
buffer.open(QIODevice::WriteOnly);
@@ -1516,7 +1671,7 @@ void JsonAPI::setImage(const Image<ColorRgb>& image)
emit callbackMessage(_streaming_image_reply);
}
void JsonAPI::incommingLogMessage(const Logger::T_LOG_MESSAGE &msg)
{
QJsonObject result, message;
QJsonArray messageArray;
@@ -1524,7 +1679,7 @@ void JsonAPI::incommingLogMessage(const Logger::T_LOG_MESSAGE& msg)
if (!_streaming_logging_activated)
{
_streaming_logging_activated = true;
const QList<Logger::T_LOG_MESSAGE> *logBuffer = LoggerManager::getInstance()->getLogMessageBuffer();
for (int i = 0; i < logBuffer->length(); i++)
{
message["appName"] = logBuffer->at(i).appName;
@@ -1560,7 +1715,7 @@ void JsonAPI::incommingLogMessage(const Logger::T_LOG_MESSAGE& msg)
emit callbackMessage(_streaming_logging_reply);
}
void JsonAPI::newPendingTokenRequest(const QString &id, const QString &comment)
{
QJsonObject obj;
obj["comment"] = comment;
@@ -1570,7 +1725,7 @@ void JsonAPI::newPendingTokenRequest(const QString& id, const QString& comment)
sendSuccessDataReply(QJsonDocument(obj), "authorize-tokenRequest", 1);
}
void JsonAPI::handleTokenResponse(bool success, const QString &token, const QString &comment, const QString &id, const int &tan)
{
const QString cmd = "authorize-requestToken";
QJsonObject result;
@@ -1584,7 +1739,7 @@ void JsonAPI::handleTokenResponse(bool success, const QString& token, const QStr
sendErrorReply("Token request timeout or denied", cmd, tan);
}
void JsonAPI::handleInstanceStateChange(InstanceState state, quint8 instance, const QString &name)
{
switch (state)
{


@@ -19,6 +19,7 @@ Effect::Effect(Hyperion *hyperion, int priority, int timeout, const QString &scr
, _hyperion(hyperion)
, _priority(priority)
, _timeout(timeout)
, _isEndless(timeout <= ENDLESS)
, _script(script)
, _name(name)
, _args(args)
@@ -51,7 +52,7 @@ Effect::~Effect()
bool Effect::isInterruptionRequested()
{
return _interupt || (!_isEndless && getRemaining() <= 0);
}
int Effect::getRemaining() const
@@ -59,12 +60,11 @@ int Effect::getRemaining() const
// determine the timeout
int timeout = _timeout;
if (timeout >= 0)
{
timeout = static_cast<int>( _endTime - QDateTime::currentMSecsSinceEpoch());
return timeout;
}
return timeout;
}
void Effect::setModuleParameters()


@@ -12,24 +12,24 @@ endif (ENABLE_FB)
if (ENABLE_OSX)
add_subdirectory(osx)
endif(ENABLE_OSX)
if (ENABLE_V4L2 OR ENABLE_MF)
add_subdirectory(video)
endif ()
if (ENABLE_X11)
add_subdirectory(x11)
endif(ENABLE_X11)
if (ENABLE_XCB)
add_subdirectory(xcb)
endif(ENABLE_XCB)
if (ENABLE_QT)
add_subdirectory(qt)
endif(ENABLE_QT)
if (ENABLE_DX)
add_subdirectory(directx)
endif(ENABLE_DX)


@@ -2,7 +2,6 @@
#include <algorithm>
#include <cassert>
#include <iostream>
// Linux includes
#include <errno.h>
@@ -12,156 +11,323 @@
#include <sys/stat.h>
#include <sys/types.h>
// qt
#include <QFile>
#include <QJsonObject>
#include <QJsonArray>
#include <QJsonDocument>
#include <QSize>
// Local includes
#include <utils/Logger.h>
#include <grabber/AmlogicGrabber.h>
#include "Amvideocap.h"
// Constants
namespace {
const bool verbose = false;
const char DEFAULT_FB_DEVICE[] = "/dev/fb0";
const char DEFAULT_VIDEO_DEVICE[] = "/dev/amvideo";
const char DEFAULT_CAPTURE_DEVICE[] = "/dev/amvideocap0";
const int AMVIDEOCAP_WAIT_MAX_MS = 50;
} //End of constants
AmlogicGrabber::AmlogicGrabber()
: Grabber("AMLOGICGRABBER") // Minimum required width or height is 160
, _captureDev(-1)
, _videoDev(-1)
, _lastError(0)
, _fbGrabber(DEFAULT_FB_DEVICE)
, _grabbingModeNotification(0)
{
_image_bgr.resize(_width, _height);
_bytesToRead = _image_bgr.size();
_image_ptr = _image_bgr.memptr();
_useImageResampler = true;
}
AmlogicGrabber::~AmlogicGrabber()
{
closeDevice(_captureDev);
closeDevice(_videoDev);
}
bool AmlogicGrabber::setupScreen()
{
bool rc (false);
QSize screenSize = _fbGrabber.getScreenSize(DEFAULT_FB_DEVICE);
if ( !screenSize.isEmpty() )
{
if (setWidthHeight(screenSize.width(), screenSize.height()))
{
rc = _fbGrabber.setupScreen();
}
}
return rc;
}
bool AmlogicGrabber::openDevice(int &fd, const char* dev)
{
if (fd<0)
{
fd = ::open(dev, O_RDWR);
}
return fd >= 0;
}
void AmlogicGrabber::closeDevice(int &fd)
{
if (fd >= 0)
{
::close(fd);
fd = -1;
}
}
bool AmlogicGrabber::isVideoPlaying()
{
bool rc = false;
if(QFile::exists(DEFAULT_VIDEO_DEVICE))
{
int videoDisabled = 1;
if (!openDevice(_videoDev, DEFAULT_VIDEO_DEVICE))
{
Error(_log, "Failed to open video device(%s): %d - %s", DEFAULT_VIDEO_DEVICE, errno, strerror(errno));
}
else
{
// Check the video disabled flag
if(ioctl(_videoDev, AMSTREAM_IOC_GET_VIDEO_DISABLE, &videoDisabled) < 0)
{
Error(_log, "Failed to retrieve video state from device: %d - %s", errno, strerror(errno));
closeDevice(_videoDev);
}
else
{
if ( videoDisabled == 0 )
{
rc = true;
}
}
}
}
return rc;
}
int AmlogicGrabber::grabFrame(Image<ColorRgb> & image)
{
int rc = 0;
if (_isEnabled && !_isDeviceInError)
{
// Make sure video is playing, else there is nothing to grab
if (isVideoPlaying())
{
if (_grabbingModeNotification!=1)
{
Info(_log, "Switch to VPU capture mode");
_grabbingModeNotification = 1;
_lastError = 0;
}
if (grabFrame_amvideocap(image) < 0) {
closeDevice(_captureDev);
rc = -1;
}
}
else
{
if (_grabbingModeNotification!=2)
{
Info( _log, "Switch to Framebuffer capture mode");
_grabbingModeNotification = 2;
_lastError = 0;
}
rc = _fbGrabber.grabFrame(image);
//usleep(50 * 1000);
}
}
return rc;
}
int AmlogicGrabber::grabFrame_amvideocap(Image<ColorRgb> & image)
{
int rc = 0;
// If the device is not open, attempt to open it
if (_captureDev < 0)
{
if (! openDevice(_captureDev, DEFAULT_CAPTURE_DEVICE))
{
ErrorIf( _lastError != 1, _log,"Failed to open the AMLOGIC device (%d - %s):", errno, strerror(errno));
_lastError = 1;
rc = -1;
return rc;
}
}
long r1 = ioctl(_captureDev, AMVIDEOCAP_IOW_SET_WANTFRAME_WIDTH, _width);
long r2 = ioctl(_captureDev, AMVIDEOCAP_IOW_SET_WANTFRAME_HEIGHT, _height);
long r3 = ioctl(_captureDev, AMVIDEOCAP_IOW_SET_WANTFRAME_AT_FLAGS, CAP_FLAG_AT_END);
long r4 = ioctl(_captureDev, AMVIDEOCAP_IOW_SET_WANTFRAME_WAIT_MAX_MS, AMVIDEOCAP_WAIT_MAX_MS);
if (r1<0 || r2<0 || r3<0 || r4<0 || _height==0 || _width==0)
{
ErrorIf(_lastError != 2,_log,"Failed to configure capture device (%d - %s)", errno, strerror(errno));
_lastError = 2;
rc = -1;
}
else
{
int linelen = ((_width + 31) & ~31) * 3;
size_t _bytesToRead = linelen * _height;
// Read the snapshot into the memory
ssize_t bytesRead = pread(_captureDev, _image_ptr, _bytesToRead, 0);
if (bytesRead < 0)
{
int state;
ioctl(_captureDev, AMVIDEOCAP_IOR_GET_STATE, &state);
if (state == AMVIDEOCAP_STATE_ON_CAPTURE)
{
DebugIf(_lastError != 5, _log,"Video playback has been paused");
_lastError = 5;
}
else
{
ErrorIf(_lastError != 3, _log,"Read of device failed: %d - %s", errno, strerror(errno));
_lastError = 3;
}
rc = -1;
}
else
{
if (static_cast<ssize_t>(_bytesToRead) != bytesRead)
{
// Read of snapshot failed
ErrorIf(_lastError != 4, _log,"Capture failed to grab entire image [bytesToRead(%d) != bytesRead(%d)]", _bytesToRead, bytesRead);
_lastError = 4;
rc = -1;
}
else {
_imageResampler.processImage(static_cast<uint8_t*>(_image_ptr),
_width,
_height,
linelen,
PixelFormat::BGR24, image);
_lastError = 0;
rc = 0;
}
}
}
return rc;
}
QJsonObject AmlogicGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
if(QFile::exists(DEFAULT_VIDEO_DEVICE) && QFile::exists(DEFAULT_CAPTURE_DEVICE) )
{
QJsonArray video_inputs;
QSize screenSize = _fbGrabber.getScreenSize();
if ( !screenSize.isEmpty() )
{
int fbIdx = _fbGrabber.getPath().rightRef(1).toInt();
DebugIf(verbose, _log, "FB device [%s] found with resolution: %dx%d", QSTRING_CSTR(_fbGrabber.getPath()), screenSize.width(), screenSize.height());
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
QJsonObject in;
QString displayName = QString("Display%1").arg(fbIdx);
in["name"] = displayName;
in["inputIdx"] = fbIdx;
QJsonArray formats;
QJsonObject format;
QJsonArray resolutionArray;
QJsonObject resolution;
resolution["width"] = screenSize.width();
resolution["height"] = screenSize.height();
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
if (!video_inputs.isEmpty())
{
inputsDiscovered["device"] = "amlogic";
inputsDiscovered["device_name"] = "AmLogic";
inputsDiscovered["type"] = "screen";
inputsDiscovered["video_inputs"] = video_inputs;
}
}
if (inputsDiscovered.isEmpty())
{
DebugIf(verbose, _log, "No displays found to capture from!");
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}
void AmlogicGrabber::setVideoMode(VideoMode mode)
{
Grabber::setVideoMode(mode);
_fbGrabber.setVideoMode(mode);
}
bool AmlogicGrabber::setPixelDecimation(int pixelDecimation)
{
return ( Grabber::setPixelDecimation( pixelDecimation) &&
_fbGrabber.setPixelDecimation( pixelDecimation));
}
void AmlogicGrabber::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
Grabber::setCropping(cropLeft, cropRight, cropTop, cropBottom);
_fbGrabber.setCropping(cropLeft, cropRight, cropTop, cropBottom);
}
bool AmlogicGrabber::setWidthHeight(int width, int height)
{
bool rc (false);
if ( Grabber::setWidthHeight(width, height) )
{
_image_bgr.resize(static_cast<unsigned>(width), static_cast<unsigned>(height));
_width = width;
_height = height;
_bytesToRead = _image_bgr.size();
_image_ptr = _image_bgr.memptr();
rc = _fbGrabber.setWidthHeight(width, height);
}
return rc;
}
bool AmlogicGrabber::setFramerate(int fps)
{
return (Grabber::setFramerate(fps) &&
_fbGrabber.setFramerate(fps));
}


@@ -1,9 +1,11 @@
#include <grabber/AmlogicWrapper.h>
AmlogicWrapper::AmlogicWrapper(int pixelDecimation, int updateRate_Hz)
: GrabberWrapper("Amlogic", &_grabber, updateRate_Hz)
, _grabber()
{
_grabber.setPixelDecimation(pixelDecimation);
}
void AmlogicWrapper::action()
{


@@ -11,11 +11,35 @@
#define CAP_FLAG_AT_TIME_WINDOW 1
#define CAP_FLAG_AT_END 2
#define AMVIDEOCAP_IOW_SET_WANTFRAME_FORMAT _IOW(AMVIDEOCAP_IOC_MAGIC, 0x01, int)
#define AMVIDEOCAP_IOW_SET_WANTFRAME_WIDTH _IOW(AMVIDEOCAP_IOC_MAGIC, 0x02, int)
#define AMVIDEOCAP_IOW_SET_WANTFRAME_HEIGHT _IOW(AMVIDEOCAP_IOC_MAGIC, 0x03, int)
#define AMVIDEOCAP_IOW_SET_WANTFRAME_TIMESTAMP_MS _IOW(AMVIDEOCAP_IOC_MAGIC, 0x04, unsigned long long)
#define AMVIDEOCAP_IOW_SET_WANTFRAME_WAIT_MAX_MS _IOW(AMVIDEOCAP_IOC_MAGIC, 0x05, unsigned long long)
#define AMVIDEOCAP_IOW_SET_WANTFRAME_AT_FLAGS _IOW(AMVIDEOCAP_IOC_MAGIC, 0x06, int)
#define AMVIDEOCAP_IOR_GET_FRAME_FORMAT _IOR(AMVIDEOCAP_IOC_MAGIC, 0x10, int)
#define AMVIDEOCAP_IOR_GET_FRAME_WIDTH _IOR(AMVIDEOCAP_IOC_MAGIC, 0x11, int)
#define AMVIDEOCAP_IOR_GET_FRAME_HEIGHT _IOR(AMVIDEOCAP_IOC_MAGIC, 0x12, int)
#define AMVIDEOCAP_IOR_GET_FRAME_TIMESTAMP_MS _IOR(AMVIDEOCAP_IOC_MAGIC, 0x13, int)
#define AMVIDEOCAP_IOR_GET_SRCFRAME_FORMAT _IOR(AMVIDEOCAP_IOC_MAGIC, 0x20, int)
#define AMVIDEOCAP_IOR_GET_SRCFRAME_WIDTH _IOR(AMVIDEOCAP_IOC_MAGIC, 0x21, int)
#define AMVIDEOCAP_IOR_GET_SRCFRAME_HEIGHT _IOR(AMVIDEOCAP_IOC_MAGIC, 0x22, int)
#define AMVIDEOCAP_IOR_GET_STATE _IOR(AMVIDEOCAP_IOC_MAGIC, 0x31, int)
#define AMVIDEOCAP_IOW_SET_START_CAPTURE _IOW(AMVIDEOCAP_IOC_MAGIC, 0x32, int)
#define AMVIDEOCAP_IOW_SET_CANCEL_CAPTURE _IOW(AMVIDEOCAP_IOC_MAGIC, 0x33, int)
#define _A_M 'S'
#define AMSTREAM_IOC_GET_VIDEO_DISABLE _IOR((_A_M), 0x48, int)
#define AMVIDEOCAP_IOC_MAGIC 'V'
enum amvideocap_state{
AMVIDEOCAP_STATE_INIT=0,
AMVIDEOCAP_STATE_ON_CAPTURE=200,
AMVIDEOCAP_STATE_FINISHED_CAPTURE=300,
AMVIDEOCAP_STATE_ERROR=0xffff,
};


@@ -4,9 +4,13 @@
#pragma comment(lib, "d3d9.lib")
#pragma comment(lib,"d3dx9.lib")
// Constants
namespace {
const bool verbose = true;
} //End of constants
DirectXGrabber::DirectXGrabber(int display, int cropLeft, int cropRight, int cropTop, int cropBottom)
: Grabber("DXGRABBER", cropLeft, cropRight, cropTop, cropBottom)
, _display(unsigned(display))
, _displayWidth(0)
, _displayHeight(0)
@@ -15,8 +19,6 @@ DirectXGrabber::DirectXGrabber(int cropLeft, int cropRight, int cropTop, int cro
, _device(nullptr)
, _surface(nullptr)
{
}
DirectXGrabber::~DirectXGrabber()
@@ -140,15 +142,24 @@ bool DirectXGrabber::setupDisplay()
int DirectXGrabber::grabFrame(Image<ColorRgb> & image)
{
if (!_isEnabled)
{
qDebug() << "AUS";
return 0;
}
if (_device == nullptr)
{
// reinit, this will disable capture on failure
bool result = setupDisplay();
setEnabled(result);
return -1;
}
if (FAILED(_device->GetFrontBufferData(0, _surface)))
{
Error(_log, "Unable to get Buffer Surface Data");
setEnabled(setupDisplay());
return 0;
}
D3DXLoadSurfaceFromSurface(_surfaceDest, nullptr, nullptr, _surface, nullptr, _srcRect, D3DX_DEFAULT, 0);
@@ -181,22 +192,91 @@ void DirectXGrabber::setVideoMode(VideoMode mode)
setupDisplay();
}
bool DirectXGrabber::setPixelDecimation(int pixelDecimation)
{
if(Grabber::setPixelDecimation(pixelDecimation))
return setupDisplay();
return false;
}
void DirectXGrabber::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
Grabber::setCropping(cropLeft, cropRight, cropTop, cropBottom);
setupDisplay();
}
bool DirectXGrabber::setDisplayIndex(int index)
{
bool rc (true);
if(_display != unsigned(index))
{
_display = unsigned(index);
rc = setupDisplay();
}
return rc;
}
QJsonObject DirectXGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
if ((_d3d9 = Direct3DCreate9(D3D_SDK_VERSION)) != nullptr)
{
int adapterCount = (int)_d3d9->GetAdapterCount();
if(adapterCount > 0)
{
inputsDiscovered["device"] = "dx";
inputsDiscovered["device_name"] = "DX";
inputsDiscovered["type"] = "screen";
QJsonArray video_inputs;
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
for(int adapter = 0; adapter < adapterCount; adapter++)
{
QJsonObject in;
in["inputIdx"] = adapter;
D3DADAPTER_IDENTIFIER9 identifier;
_d3d9->GetAdapterIdentifier(adapter, D3DENUM_WHQL_LEVEL, &identifier);
QString name = identifier.DeviceName;
int pos = name.lastIndexOf('\\');
if (pos != -1)
name = name.right(name.length()-pos-1);
in["name"] = name;
D3DDISPLAYMODE ddm;
_d3d9->GetAdapterDisplayMode(adapter, &ddm);
QJsonArray formats, resolutionArray;
QJsonObject format, resolution;
resolution["width"] = (int)ddm.Width;
resolution["height"] = (int)ddm.Height;
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
inputsDiscovered["video_inputs"] = video_inputs;
}
else
{
DebugIf(verbose, _log, "No displays found to capture from!");
}
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}


@@ -1,9 +1,16 @@
#include <grabber/DirectXWrapper.h>
DirectXWrapper::DirectXWrapper( int updateRate_Hz,
int display,
int pixelDecimation,
int cropLeft, int cropRight, int cropTop, int cropBottom
)
: GrabberWrapper("DirectX", &_grabber, updateRate_Hz)
, _grabber(display, cropLeft, cropRight, cropTop, cropBottom)
{
_grabber.setPixelDecimation(pixelDecimation);
}
void DirectXWrapper::action()
{


@@ -3,48 +3,34 @@
#include <cassert>
#include <iostream>
//Qt
#include <QJsonObject>
#include <QJsonArray>
#include <QJsonDocument>
#include <QSize>
// Constants
namespace {
const bool verbose = false;
const int DEFAULT_DEVICE = 0;
} //End of constants
// Local includes
#include "grabber/DispmanxFrameGrabber.h"
DispmanxFrameGrabber::DispmanxFrameGrabber()
: Grabber("DISPMANXGRABBER")
, _vc_display(0)
, _vc_resource(0)
, _vc_flags(DISPMANX_TRANSFORM_T(0))
, _captureBuffer(new ColorRgba[0])
, _captureBufferSize(0)
, _image_rgba()
{
_useImageResampler = true;
// Initialise BCM
bcm_host_init();
}
DispmanxFrameGrabber::~DispmanxFrameGrabber()
@@ -55,6 +41,28 @@ DispmanxFrameGrabber::~DispmanxFrameGrabber()
bcm_host_deinit();
}
bool DispmanxFrameGrabber::setupScreen()
{
bool rc (false);
int deviceIdx (DEFAULT_DEVICE);
QSize screenSize = getScreenSize(deviceIdx);
if ( screenSize.isEmpty() )
{
Error(_log, "Failed to open display [%d]! Probably no permissions to access the capture interface", deviceIdx);
setEnabled(false);
}
else
{
setWidthHeight(screenSize.width(), screenSize.height());
Info(_log, "Display [%d] opened with resolution: %dx%d", deviceIdx, screenSize.width(), screenSize.height());
setEnabled(true);
rc = true;
}
return rc;
}
void DispmanxFrameGrabber::freeResources()
{
delete[] _captureBuffer;
@@ -64,152 +72,219 @@ void DispmanxFrameGrabber::freeResources()
bool DispmanxFrameGrabber::setWidthHeight(int width, int height)
{
bool rc = false;
if(Grabber::setWidthHeight(width, height))
{
if(_vc_resource != 0)
if(_vc_resource != 0) {
vc_dispmanx_resource_delete(_vc_resource);
// Create the resources for capturing image
}
Debug(_log,"Create the resources for capturing image");
uint32_t vc_nativeImageHandle;
_vc_resource = vc_dispmanx_resource_create(
VC_IMAGE_RGBA32,
width,
height,
&vc_nativeImageHandle);
VC_IMAGE_RGBA32,
width,
height,
&vc_nativeImageHandle);
assert(_vc_resource);
// Define the capture rectangle with the same size
vc_dispmanx_rect_set(&_rectangle, 0, 0, width, height);
return true;
if (_vc_resource != 0)
{
Debug(_log,"Define the capture rectangle with the same size");
vc_dispmanx_rect_set(&_rectangle, 0, 0, width, height);
rc = true;
}
}
return false;
return rc;
}
void DispmanxFrameGrabber::setFlags(int vc_flags)
void DispmanxFrameGrabber::setFlags(DISPMANX_TRANSFORM_T vc_flags)
{
_vc_flags = vc_flags;
}
int DispmanxFrameGrabber::grabFrame(Image<ColorRgb> & image)
{
if (!_enabled) return 0;
int ret;
// vc_dispmanx_resource_read_data doesn't seem to work well
// with arbitrary positions so we have to handle cropping by ourselves
unsigned cropLeft = _cropLeft;
unsigned cropRight = _cropRight;
unsigned cropTop = _cropTop;
unsigned cropBottom = _cropBottom;
if (_vc_flags & DISPMANX_SNAPSHOT_FILL)
int rc = 0;
if (_isEnabled && !_isDeviceInError)
{
// disable cropping, we are capturing the video overlay window
cropLeft = cropRight = cropTop = cropBottom = 0;
}
// vc_dispmanx_resource_read_data doesn't seem to work well
// with arbitrary positions so we have to handle cropping by ourselves
int cropLeft = _cropLeft;
int cropRight = _cropRight;
int cropTop = _cropTop;
int cropBottom = _cropBottom;
unsigned imageWidth = _width - cropLeft - cropRight;
unsigned imageHeight = _height - cropTop - cropBottom;
// calculate final image dimensions and adjust top/left cropping in 3D modes
switch (_videoMode)
{
case VideoMode::VIDEO_3DSBS:
imageWidth /= 2;
cropLeft /= 2;
break;
case VideoMode::VIDEO_3DTAB:
imageHeight /= 2;
cropTop /= 2;
break;
case VideoMode::VIDEO_2D:
default:
break;
}
// resize the given image if needed
if (image.width() != imageWidth || image.height() != imageHeight)
{
image.resize(imageWidth, imageHeight);
}
if (_image_rgba.width() != imageWidth || _image_rgba.height() != imageHeight)
{
_image_rgba.resize(imageWidth, imageHeight);
}
// Open the connection to the display
_vc_display = vc_dispmanx_display_open(0);
if (_vc_display < 0)
{
Error(_log, "Cannot open display: %d", _vc_display);
return -1;
}
// Create the snapshot (incl down-scaling)
ret = vc_dispmanx_snapshot(_vc_display, _vc_resource, (DISPMANX_TRANSFORM_T) _vc_flags);
if (ret < 0)
{
Error(_log, "Snapshot failed: %d", ret);
vc_dispmanx_display_close(_vc_display);
return ret;
}
// Read the snapshot into the memory
void* imagePtr = _image_rgba.memptr();
void* capturePtr = imagePtr;
unsigned imagePitch = imageWidth * sizeof(ColorRgba);
// dispmanx seems to require the pitch to be a multiple of 64
unsigned capturePitch = (_rectangle.width * sizeof(ColorRgba) + 63) & (~63);
// grab to temp buffer if image pitch isn't valid or if we are cropping
if (imagePitch != capturePitch
|| (unsigned)_rectangle.width != imageWidth
|| (unsigned)_rectangle.height != imageHeight)
{
// check if we need to resize the capture buffer
unsigned captureSize = capturePitch * _rectangle.height / sizeof(ColorRgba);
if (_captureBufferSize != captureSize)
if (_vc_flags & DISPMANX_SNAPSHOT_FILL)
{
delete[] _captureBuffer;
_captureBuffer = new ColorRgba[captureSize];
_captureBufferSize = captureSize;
// disable cropping, we are capturing the video overlay window
Debug(_log,"Disable cropping, as the video overlay window is captured");
cropLeft = cropRight = cropTop = cropBottom = 0;
}
capturePtr = &_captureBuffer[0];
}
unsigned imageWidth = static_cast<unsigned>(_width - cropLeft - cropRight);
unsigned imageHeight = static_cast<unsigned>(_height - cropTop - cropBottom);
ret = vc_dispmanx_resource_read_data(_vc_resource, &_rectangle, capturePtr, capturePitch);
if (ret < 0)
{
Error(_log, "vc_dispmanx_resource_read_data failed: %d", ret);
vc_dispmanx_display_close(_vc_display);
return ret;
}
// copy capture data to image if we captured to temp buffer
if (imagePtr != capturePtr)
{
// adjust source pointer to top/left cropping
uint8_t* src_ptr = (uint8_t*) capturePtr
+ cropLeft * sizeof(ColorRgba)
+ cropTop * capturePitch;
for (unsigned y = 0; y < imageHeight; y++)
// resize the given image if needed
if (image.width() != imageWidth || image.height() != imageHeight)
{
memcpy((uint8_t*)imagePtr + y * imagePitch,
src_ptr + y * capturePitch,
imagePitch);
image.resize(imageWidth, imageHeight);
}
if (_image_rgba.width() != imageWidth || _image_rgba.height() != imageHeight)
{
_image_rgba.resize(imageWidth, imageHeight);
}
// Open the connection to the display
_vc_display = vc_dispmanx_display_open(DEFAULT_DEVICE);
if (_vc_display < 0)
{
Error(_log, "Cannot open display: %d", DEFAULT_DEVICE);
rc = -1;
}
else {
// Create the snapshot (incl down-scaling)
int ret = vc_dispmanx_snapshot(_vc_display, _vc_resource, _vc_flags);
if (ret < 0)
{
Error(_log, "Snapshot failed: %d", ret);
rc = ret;
}
else
{
// Read the snapshot into the memory
void* imagePtr = _image_rgba.memptr();
void* capturePtr = imagePtr;
unsigned imagePitch = imageWidth * sizeof(ColorRgba);
// dispmanx seems to require the pitch to be a multiple of 64
unsigned capturePitch = (_rectangle.width * sizeof(ColorRgba) + 63) & (~63);
// grab to temp buffer if image pitch isn't valid or if we are cropping
if (imagePitch != capturePitch
|| static_cast<unsigned>(_rectangle.width) != imageWidth
|| static_cast<unsigned>(_rectangle.height) != imageHeight)
{
// check if we need to resize the capture buffer
unsigned captureSize = capturePitch * static_cast<unsigned>(_rectangle.height) / sizeof(ColorRgba);
if (_captureBufferSize != captureSize)
{
delete[] _captureBuffer;
_captureBuffer = new ColorRgba[captureSize];
_captureBufferSize = captureSize;
}
capturePtr = &_captureBuffer[0];
}
ret = vc_dispmanx_resource_read_data(_vc_resource, &_rectangle, capturePtr, capturePitch);
if (ret < 0)
{
Error(_log, "vc_dispmanx_resource_read_data failed: %d", ret);
rc = ret;
}
else
{
_imageResampler.processImage(static_cast<uint8_t*>(capturePtr),
_width,
_height,
static_cast<int>(capturePitch),
PixelFormat::RGB32,
image);
}
}
vc_dispmanx_display_close(_vc_display);
}
}
// Close the display
vc_dispmanx_display_close(_vc_display);
// image to output image
_image_rgba.toRgb(image);
return 0;
return rc;
}
QSize DispmanxFrameGrabber::getScreenSize(int device) const
{
int width (0);
int height(0);
DISPMANX_DISPLAY_HANDLE_T vc_display = vc_dispmanx_display_open(device);
if ( vc_display > 0)
{
// Obtain the display information
DISPMANX_MODEINFO_T vc_info;
int result = vc_dispmanx_display_get_info(vc_display, &vc_info);
if (result == 0)
{
width = vc_info.width;
height = vc_info.height;
DebugIf(verbose, _log, "Display found with resolution: %dx%d", width, height);
}
// Close the display
vc_dispmanx_display_close(vc_display);
}
return QSize(width, height);
}
QJsonObject DispmanxFrameGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
int deviceIdx (DEFAULT_DEVICE);
QJsonArray video_inputs;
QSize screenSize = getScreenSize(deviceIdx);
if ( !screenSize.isEmpty() )
{
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
QJsonObject in;
QString displayName;
displayName = QString("Screen:%1").arg(deviceIdx);
in["name"] = displayName;
in["inputIdx"] = deviceIdx;
QJsonArray formats;
QJsonObject format;
QJsonArray resolutionArray;
QJsonObject resolution;
resolution["width"] = screenSize.width();
resolution["height"] = screenSize.height();
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
if (!video_inputs.isEmpty())
{
inputsDiscovered["device"] = "dispmanx";
inputsDiscovered["device_name"] = "DispmanX";
inputsDiscovered["type"] = "screen";
inputsDiscovered["video_inputs"] = video_inputs;
}
if (inputsDiscovered.isEmpty())
{
DebugIf(verbose, _log, "No displays found to capture from!");
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}
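The pitch computation in `DispmanxFrameGrabber::grabFrame()` above rounds the RGBA row size up to a 64-byte boundary with `(n + 63) & (~63)`, because dispmanx seems to require the pitch to be a multiple of 64. A standalone sketch of that rule (the helper name `alignPitchTo64` is ours, not part of the grabber):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical helper mirroring the pitch rule in
// DispmanxFrameGrabber::grabFrame(): round a row size in bytes
// up to the next multiple of 64, as dispmanx requires.
static uint32_t alignPitchTo64(uint32_t rowBytes)
{
	return (rowBytes + 63U) & ~63U;
}
```

For a 1918-pixel-wide capture the row is 1918 * 4 = 7672 bytes and the aligned pitch becomes 7680; whenever the aligned pitch differs from the image pitch, the grabber reads into a temporary buffer and copies row by row.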

View File

@@ -5,6 +5,7 @@
unsigned __bcm_frame_counter = 0;
const int __screenWidth = 800;
const int __screenHeight = 600;
const int __display_num = 0;
void bcm_host_init()
{
@@ -27,6 +28,7 @@ int vc_dispmanx_display_get_info(int, DISPMANX_MODEINFO_T *vc_info)
{
vc_info->width = __screenWidth;
vc_info->height = __screenHeight;
vc_info->display_num = __display_num;
return 0;
}
@@ -54,7 +56,7 @@ void vc_dispmanx_rect_set(VC_RECT_T *rectangle, int left, int top, int width, in
rectangle->top = top;
}
int vc_dispmanx_snapshot(int, DISPMANX_RESOURCE_HANDLE_T resource, int vc_flags)
int vc_dispmanx_snapshot(DISPMANX_DISPLAY_HANDLE_T /*display*/, DISPMANX_RESOURCE_HANDLE_T resource, DISPMANX_TRANSFORM_T /*vc_flags*/)
{
__bcm_frame_counter++;
if (__bcm_frame_counter > 100)
@@ -66,7 +68,7 @@ int vc_dispmanx_snapshot(int, DISPMANX_RESOURCE_HANDLE_T resource, int vc_flags)
if (__bcm_frame_counter < 25)
{
color[0] = ColorRgba::WHITE;
0 color[1] = ColorRgba::RED;
color[1] = ColorRgba::RED;
color[2] = ColorRgba::BLUE;
color[3] = ColorRgba::GREEN;
}

View File

@@ -1,10 +1,12 @@
#include <grabber/DispmanxWrapper.h>
DispmanxWrapper::DispmanxWrapper(unsigned grabWidth, unsigned grabHeight, unsigned updateRate_Hz)
: GrabberWrapper("Dispmanx", &_grabber, grabWidth, grabHeight, updateRate_Hz)
, _grabber(grabWidth, grabHeight)
DispmanxWrapper::DispmanxWrapper( int updateRate_Hz,
int pixelDecimation
)
: GrabberWrapper("Dispmanx", &_grabber, updateRate_Hz)
, _grabber()
{
_grabber.setPixelDecimation(pixelDecimation);
}
void DispmanxWrapper::action()

View File

@@ -10,102 +10,261 @@
// STL includes
#include <iostream>
//Qt
#include <QJsonObject>
#include <QJsonArray>
#include <QJsonDocument>
#include <QDir>
#include <QSize>
// Constants
namespace {
const bool verbose = false;
// fb discovery service
const char DISCOVERY_DIRECTORY[] = "/dev/";
const char DISCOVERY_FILEPATTERN[] = "fb?";
} //End of constants
// Local includes
#include <grabber/FramebufferFrameGrabber.h>
FramebufferFrameGrabber::FramebufferFrameGrabber(const QString & device, unsigned width, unsigned height)
: Grabber("FRAMEBUFFERGRABBER", width, height)
, _fbDevice()
FramebufferFrameGrabber::FramebufferFrameGrabber(const QString & device)
: Grabber("FRAMEBUFFERGRABBER")
, _fbDevice(device)
, _fbfd (-1)
{
setDevicePath(device);
_useImageResampler = true;
}
FramebufferFrameGrabber::~FramebufferFrameGrabber()
{
closeDevice();
}
bool FramebufferFrameGrabber::setupScreen()
{
bool rc (false);
if ( _fbfd >= 0 )
{
closeDevice();
}
rc = getScreenInfo();
setEnabled(rc);
return rc;
}
bool FramebufferFrameGrabber::setWidthHeight(int width, int height)
{
bool rc (false);
if(Grabber::setWidthHeight(width, height))
{
rc = setupScreen();
}
return rc;
}
int FramebufferFrameGrabber::grabFrame(Image<ColorRgb> & image)
{
if (!_enabled) return 0;
int rc = 0;
struct fb_var_screeninfo vinfo;
unsigned capSize, bytesPerPixel;
PixelFormat pixelFormat;
/* Open the framebuffer device */
int fbfd = open(QSTRING_CSTR(_fbDevice), O_RDONLY);
if (fbfd == -1)
if (_isEnabled && !_isDeviceInError)
{
Error(_log, "Error opening %s, %s : ", QSTRING_CSTR(_fbDevice), std::strerror(errno));
return -1;
}
/* get variable screen information */
ioctl (fbfd, FBIOGET_VSCREENINFO, &vinfo);
bytesPerPixel = vinfo.bits_per_pixel / 8;
capSize = vinfo.xres * vinfo.yres * bytesPerPixel;
switch (vinfo.bits_per_pixel)
{
case 16: pixelFormat = PixelFormat::BGR16; break;
case 24: pixelFormat = PixelFormat::BGR24; break;
#ifdef ENABLE_AMLOGIC
case 32: pixelFormat = PixelFormat::PIXELFORMAT_RGB32; break;
#else
case 32: pixelFormat = PixelFormat::BGR32; break;
#endif
default:
Error(_log, "Unknown pixel format: %d bits per pixel", vinfo.bits_per_pixel);
close(fbfd);
return -1;
}
/* map the device to memory */
unsigned char * fbp = (unsigned char*)mmap(0, capSize, PROT_READ, MAP_PRIVATE | MAP_NORESERVE, fbfd, 0);
if (fbp == MAP_FAILED) {
Error(_log, "Error mapping %s, %s : ", QSTRING_CSTR(_fbDevice), std::strerror(errno));
return -1;
}
_imageResampler.setHorizontalPixelDecimation(vinfo.xres/_width);
_imageResampler.setVerticalPixelDecimation(vinfo.yres/_height);
_imageResampler.processImage(fbp,
vinfo.xres,
vinfo.yres,
vinfo.xres * bytesPerPixel,
pixelFormat,
image);
munmap(fbp, capSize);
close(fbfd);
return 0;
}
void FramebufferFrameGrabber::setDevicePath(const QString& path)
{
if(_fbDevice != path)
{
_fbDevice = path;
int result;
struct fb_var_screeninfo vinfo;
// Check if the framebuffer device can be opened and display the current resolution
int fbfd = open(QSTRING_CSTR(_fbDevice), O_RDONLY);
if (fbfd == -1)
if ( getScreenInfo() )
{
Error(_log, "Error opening %s, %s : ", QSTRING_CSTR(_fbDevice), std::strerror(errno));
}
else
{
// get variable screen information
result = ioctl (fbfd, FBIOGET_VSCREENINFO, &vinfo);
if (result != 0)
{
Error(_log, "Could not get screen information, %s", std::strerror(errno));
/* map the device to memory */
uint8_t * fbp = static_cast<uint8_t*>(mmap(nullptr, _fixInfo.smem_len, PROT_READ, MAP_PRIVATE | MAP_NORESERVE, _fbfd, 0));
if (fbp == MAP_FAILED) {
QString errorReason = QString ("Error mapping %1, [%2] %3").arg(_fbDevice).arg(errno).arg(std::strerror(errno));
this->setInError ( errorReason );
closeDevice();
rc = -1;
}
else
{
Info(_log, "Display opened with resolution: %dx%d@%dbit", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);
_imageResampler.processImage(fbp,
static_cast<int>(_varInfo.xres),
static_cast<int>(_varInfo.yres),
static_cast<int>(_fixInfo.line_length),
_pixelFormat,
image);
munmap(fbp, _fixInfo.smem_len);
}
}
closeDevice();
}
return rc;
}
bool FramebufferFrameGrabber::openDevice()
{
bool rc = true;
/* Open the framebuffer device */
_fbfd = ::open(QSTRING_CSTR(_fbDevice), O_RDONLY);
if (_fbfd < 0)
{
QString errorReason = QString ("Error opening %1, [%2] %3").arg(_fbDevice).arg(errno).arg(std::strerror(errno));
this->setInError ( errorReason );
rc = false;
}
return rc;
}
bool FramebufferFrameGrabber::closeDevice()
{
bool rc = false;
if (_fbfd >= 0)
{
if( ::close(_fbfd) == 0) {
rc = true;
}
_fbfd = -1;
}
return rc;
}
QSize FramebufferFrameGrabber::getScreenSize() const
{
return getScreenSize(_fbDevice);
}
QSize FramebufferFrameGrabber::getScreenSize(const QString& device) const
{
int width (0);
int height(0);
int fbfd = ::open(QSTRING_CSTR(device), O_RDONLY);
if (fbfd != -1)
{
struct fb_var_screeninfo vinfo;
int result = ioctl (fbfd, FBIOGET_VSCREENINFO, &vinfo);
if (result == 0)
{
width = static_cast<int>(vinfo.xres);
height = static_cast<int>(vinfo.yres);
DebugIf(verbose, _log, "FB device [%s] found with resolution: %dx%d", QSTRING_CSTR(device), width, height);
}
::close(fbfd);
}
return QSize(width, height);
}
bool FramebufferFrameGrabber::getScreenInfo()
{
bool rc (false);
if ( openDevice() )
{
if (ioctl(_fbfd, FBIOGET_FSCREENINFO, &_fixInfo) < 0 || ioctl (_fbfd, FBIOGET_VSCREENINFO, &_varInfo) < 0)
{
QString errorReason = QString ("Error getting screen information for %1, [%2] %3").arg(_fbDevice).arg(errno).arg(std::strerror(errno));
this->setInError ( errorReason );
closeDevice();
}
else
{
rc = true;
switch (_varInfo.bits_per_pixel)
{
case 16: _pixelFormat = PixelFormat::BGR16;
break;
case 24: _pixelFormat = PixelFormat::BGR24;
break;
#ifdef ENABLE_AMLOGIC
case 32: _pixelFormat = PixelFormat::PIXELFORMAT_RGB32;
break;
#else
case 32: _pixelFormat = PixelFormat::BGR32;
break;
#endif
default:
rc = false;
QString errorReason = QString ("Unknown pixel format: %1 bits per pixel").arg(static_cast<int>(_varInfo.bits_per_pixel));
this->setInError ( errorReason );
closeDevice();
}
close(fbfd);
}
}
return rc;
}
QJsonObject FramebufferFrameGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
//Find framebuffer devices 0-9
QDir deviceDirectory (DISCOVERY_DIRECTORY);
QStringList deviceFilter(DISCOVERY_FILEPATTERN);
deviceDirectory.setNameFilters(deviceFilter);
deviceDirectory.setSorting(QDir::Name);
QFileInfoList deviceFiles = deviceDirectory.entryInfoList(QDir::System);
int fbIdx (0);
QJsonArray video_inputs;
QFileInfoList::const_iterator deviceFileIterator;
for (deviceFileIterator = deviceFiles.constBegin(); deviceFileIterator != deviceFiles.constEnd(); ++deviceFileIterator)
{
fbIdx = (*deviceFileIterator).fileName().rightRef(1).toInt();
QString device = (*deviceFileIterator).absoluteFilePath();
DebugIf(verbose, _log, "FB device [%s] found", QSTRING_CSTR(device));
QSize screenSize = getScreenSize(device);
if ( !screenSize.isEmpty() )
{
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
QJsonObject in;
QString displayName;
displayName = QString("FB%1").arg(fbIdx);
in["name"] = displayName;
in["inputIdx"] = fbIdx;
QJsonArray formats;
QJsonObject format;
QJsonArray resolutionArray;
QJsonObject resolution;
resolution["width"] = screenSize.width();
resolution["height"] = screenSize.height();
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
if (!video_inputs.isEmpty())
{
inputsDiscovered["device"] = "framebuffer";
inputsDiscovered["device_name"] = "Framebuffer";
inputsDiscovered["type"] = "screen";
inputsDiscovered["video_inputs"] = video_inputs;
}
}
if (inputsDiscovered.isEmpty())
{
DebugIf(verbose, _log, "No displays found to capture from!");
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}
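The bits-per-pixel switch in `getScreenInfo()` above maps the framebuffer depth to a resampler pixel format, and any other depth puts the grabber into an error state. A minimal stand-alone sketch, with a local enum standing in for the project's own `PixelFormat`:

```cpp
#include <cassert>

// Local stand-in for the project's PixelFormat; the values mirror
// the cases handled by FramebufferFrameGrabber::getScreenInfo().
enum class FbPixelFormat { Unknown, BGR16, BGR24, BGR32 };

// Map a framebuffer bit depth to a pixel format; an unsupported depth
// would trigger setInError() in the real grabber.
static FbPixelFormat formatFromBitsPerPixel(int bitsPerPixel)
{
	switch (bitsPerPixel)
	{
	case 16: return FbPixelFormat::BGR16;
	case 24: return FbPixelFormat::BGR24;
	case 32: return FbPixelFormat::BGR32; // RGB32 in ENABLE_AMLOGIC builds
	default: return FbPixelFormat::Unknown;
	}
}
```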

View File

@@ -1,9 +1,13 @@
#include <grabber/FramebufferWrapper.h>
FramebufferWrapper::FramebufferWrapper(const QString & device, unsigned grabWidth, unsigned grabHeight, unsigned updateRate_Hz)
: GrabberWrapper("FrameBuffer", &_grabber, grabWidth, grabHeight, updateRate_Hz)
, _grabber(device, grabWidth, grabHeight)
{}
FramebufferWrapper::FramebufferWrapper( int updateRate_Hz,
const QString & device,
int pixelDecimation)
: GrabberWrapper("FrameBuffer", &_grabber, updateRate_Hz)
, _grabber(device)
{
_grabber.setPixelDecimation(pixelDecimation);
}
void FramebufferWrapper::action()
{

View File

@@ -5,94 +5,204 @@
// Local includes
#include <grabber/OsxFrameGrabber.h>
OsxFrameGrabber::OsxFrameGrabber(unsigned display, unsigned width, unsigned height)
: Grabber("OSXGRABBER", width, height)
, _screenIndex(100)
//Qt
#include <QJsonObject>
#include <QJsonArray>
#include <QJsonDocument>
// Constants
namespace {
const bool verbose = false;
} //End of constants
OsxFrameGrabber::OsxFrameGrabber(int display)
: Grabber("OSXGRABBER")
, _screenIndex(display)
{
// check if display is available
setDisplayIndex(display);
_isEnabled = false;
_useImageResampler = true;
}
OsxFrameGrabber::~OsxFrameGrabber()
{
}
int OsxFrameGrabber::grabFrame(Image<ColorRgb> & image)
bool OsxFrameGrabber::setupDisplay()
{
if (!_enabled) return 0;
bool rc (false);
CGImageRef dispImage;
CFDataRef imgData;
unsigned char * pImgData;
unsigned dspWidth, dspHeight;
rc = setDisplayIndex(_screenIndex);
dispImage = CGDisplayCreateImage(_display);
// display lost, use main
if (dispImage == NULL && _display)
{
dispImage = CGDisplayCreateImage(kCGDirectMainDisplay);
// no displays connected, return
if (dispImage == NULL)
{
Error(_log, "No display connected...");
return -1;
}
}
imgData = CGDataProviderCopyData(CGImageGetDataProvider(dispImage));
pImgData = (unsigned char*) CFDataGetBytePtr(imgData);
dspWidth = CGImageGetWidth(dispImage);
dspHeight = CGImageGetHeight(dispImage);
_imageResampler.setHorizontalPixelDecimation(dspWidth/_width);
_imageResampler.setVerticalPixelDecimation(dspHeight/_height);
_imageResampler.processImage( pImgData,
dspWidth,
dspHeight,
CGImageGetBytesPerRow(dispImage),
PixelFormat::BGR32,
image);
CFRelease(imgData);
CGImageRelease(dispImage);
return 0;
return rc;
}
void OsxFrameGrabber::setDisplayIndex(int index)
int OsxFrameGrabber::grabFrame(Image<ColorRgb> & image)
{
if(_screenIndex != index)
int rc = 0;
if (_isEnabled && !_isDeviceInError)
{
CGImageRef dispImage;
CFDataRef imgData;
unsigned char * pImgData;
unsigned dspWidth;
unsigned dspHeight;
dispImage = CGDisplayCreateImage(_display);
// display lost, use main
if (dispImage == nullptr && _display != 0)
{
dispImage = CGDisplayCreateImage(kCGDirectMainDisplay);
// no displays connected, return
if (dispImage == nullptr)
{
Error(_log, "No display connected...");
return -1;
}
}
imgData = CGDataProviderCopyData(CGImageGetDataProvider(dispImage));
pImgData = (unsigned char*) CFDataGetBytePtr(imgData);
dspWidth = CGImageGetWidth(dispImage);
dspHeight = CGImageGetHeight(dispImage);
_imageResampler.processImage( pImgData,
static_cast<int>(dspWidth),
static_cast<int>(dspHeight),
static_cast<int>(CGImageGetBytesPerRow(dispImage)),
PixelFormat::BGR32,
image);
CFRelease(imgData);
CGImageRelease(dispImage);
}
return rc;
}
bool OsxFrameGrabber::setDisplayIndex(int index)
{
bool rc (true);
if(_screenIndex != index || !_isEnabled)
{
_screenIndex = index;
CGImageRef image;
CGDisplayCount displayCount;
CGDirectDisplayID displays[8];
// get list of displays
CGGetActiveDisplayList(8, displays, &displayCount);
if (_screenIndex + 1 > displayCount)
CGDisplayCount dspyCnt = 0 ;
CGDisplayErr err;
err = CGGetActiveDisplayList(0, nullptr, &dspyCnt);
if (err == kCGErrorSuccess && dspyCnt > 0)
{
Error(_log, "Display with index %d is not available. Using main display", _screenIndex);
_display = kCGDirectMainDisplay;
CGDirectDisplayID *activeDspys = new CGDirectDisplayID [dspyCnt] ;
err = CGGetActiveDisplayList(dspyCnt, activeDspys, &dspyCnt) ;
if (err == kCGErrorSuccess)
{
CGImageRef image;
if (_screenIndex + 1 > static_cast<int>(dspyCnt))
{
Error(_log, "Display with index %d is not available.", _screenIndex);
rc = false;
}
else
{
_display = activeDspys[_screenIndex];
image = CGDisplayCreateImage(_display);
if(image == nullptr)
{
setEnabled(false);
Error(_log, "Failed to open main display, disable capture interface");
rc = false;
}
else
{
setEnabled(true);
rc = true;
Info(_log, "Display [%u] opened with resolution: %ux%u@%ubit", _display, CGImageGetWidth(image), CGImageGetHeight(image), CGImageGetBitsPerPixel(image));
}
CGImageRelease(image);
}
}
}
else
{
_display = displays[_screenIndex];
rc = false;
}
image = CGDisplayCreateImage(_display);
if(image == NULL)
{
Error(_log, "Failed to open main display, disable capture interface");
setEnabled(false);
return;
}
else
setEnabled(true);
Info(_log, "Display opened with resolution: %dx%d@%dbit", CGImageGetWidth(image), CGImageGetHeight(image), CGImageGetBitsPerPixel(image));
CGImageRelease(image);
}
return rc;
}
QJsonObject OsxFrameGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
// get list of displays
CGDisplayCount dspyCnt = 0 ;
CGDisplayErr err;
err = CGGetActiveDisplayList(0, nullptr, &dspyCnt);
if (err == kCGErrorSuccess && dspyCnt > 0)
{
CGDirectDisplayID *activeDspys = new CGDirectDisplayID [dspyCnt] ;
err = CGGetActiveDisplayList(dspyCnt, activeDspys, &dspyCnt) ;
if (err == kCGErrorSuccess)
{
inputsDiscovered["device"] = "osx";
inputsDiscovered["device_name"] = "OSX";
inputsDiscovered["type"] = "screen";
QJsonArray video_inputs;
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
for (int i = 0; i < static_cast<int>(dspyCnt); ++i)
{
QJsonObject in;
CGDirectDisplayID did = activeDspys[i];
QString displayName;
displayName = QString("Display:%1").arg(did);
in["name"] = displayName;
in["inputIdx"] = i;
QJsonArray formats;
QJsonObject format;
QJsonArray resolutionArray;
QJsonObject resolution;
CGDisplayModeRef dispMode = CGDisplayCopyDisplayMode(did);
CGRect rect = CGDisplayBounds(did);
resolution["width"] = static_cast<int>(rect.size.width);
resolution["height"] = static_cast<int>(rect.size.height);
CGDisplayModeRelease(dispMode);
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
inputsDiscovered["video_inputs"] = video_inputs;
}
delete [] activeDspys;
}
if (inputsDiscovered.isEmpty())
{
DebugIf(verbose, _log, "No displays found to capture from!");
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}

View File

@@ -5,15 +5,33 @@ unsigned __osx_frame_counter = 0;
const int __screenWidth = 800;
const int __screenHeight = 600;
void CGGetActiveDisplayList(int max, CGDirectDisplayID *displays, CGDisplayCount *displayCount)
CGError CGGetActiveDisplayList(uint32_t maxDisplays, CGDirectDisplayID *activeDisplays, uint32_t *displayCount)
{
*displayCount = 1;
displays[0] = 1;
if (maxDisplays == 0 || activeDisplays == nullptr)
{
*displayCount = 2;
}
else
{
// second call: fill the caller's array and report how many entries were written
*displayCount = maxDisplays;
for (CGDirectDisplayID i = 0; i < maxDisplays; ++i)
{
activeDisplays[i] = i;
}
}
return kCGErrorSuccess;
}
CGImageRef CGDisplayCreateImage(CGDirectDisplayID display)
{
CGImageRef image = new CGImage(__screenWidth, __screenHeight);
CGImageRef image = new CGImage(__screenWidth / (display+1), __screenHeight / (display+1));
return image;
}
@@ -123,4 +141,19 @@ void CFRelease(CFDataRef imgData)
delete imgData;
}
CGDisplayModeRef CGDisplayCopyDisplayMode(CGDirectDisplayID display)
{
return nullptr;
}
CGRect CGDisplayBounds(CGDirectDisplayID display)
{
CGRect rect;
rect.size.width = __screenWidth / (display+1);
rect.size.height = __screenHeight / (display+1);
return rect;
}
void CGDisplayModeRelease(CGDisplayModeRef mode)
{
}
#endif

View File

@@ -1,9 +1,14 @@
#include <grabber/OsxWrapper.h>
OsxWrapper::OsxWrapper(unsigned display, unsigned grabWidth, unsigned grabHeight, unsigned updateRate_Hz)
: GrabberWrapper("OSX FrameGrabber", &_grabber, grabWidth, grabHeight, updateRate_Hz)
, _grabber(display, grabWidth, grabHeight)
{}
OsxWrapper::OsxWrapper( int updateRate_Hz,
int display,
int pixelDecimation
)
: GrabberWrapper("OSX", &_grabber, updateRate_Hz)
, _grabber(display)
{
_grabber.setPixelDecimation(pixelDecimation);
}
void OsxWrapper::action()
{

View File

@@ -7,23 +7,30 @@
#include <QGuiApplication>
#include <QWidget>
#include <QScreen>
#include <QJsonObject>
#include <QJsonArray>
#include <QJsonDocument>
QtGrabber::QtGrabber(int cropLeft, int cropRight, int cropTop, int cropBottom, int pixelDecimation, int display)
: Grabber("QTGRABBER", 0, 0, cropLeft, cropRight, cropTop, cropBottom)
, _display(unsigned(display))
, _pixelDecimation(pixelDecimation)
, _screenWidth(0)
, _screenHeight(0)
, _src_x(0)
, _src_y(0)
, _src_x_max(0)
, _src_y_max(0)
, _screen(nullptr)
// Constants
namespace {
const bool verbose = false;
} //End of constants
QtGrabber::QtGrabber(int display, int cropLeft, int cropRight, int cropTop, int cropBottom)
: Grabber("QTGRABBER", cropLeft, cropRight, cropTop, cropBottom)
, _display(display)
, _calculatedWidth(0)
, _calculatedHeight(0)
, _src_x(0)
, _src_y(0)
, _src_x_max(0)
, _src_y_max(0)
, _isWayland(false)
, _screen(nullptr)
, _isVirtual(false)
{
_logger = Logger::getInstance("Qt");
_useImageResampler = false;
// init
setupDisplay();
}
QtGrabber::~QtGrabber()
@@ -36,51 +43,111 @@ void QtGrabber::freeResources()
// Qt seems to hold the ownership of the QScreen pointers
}
bool QtGrabber::open()
{
bool rc = false;
#ifndef _WIN32
if (getenv("WAYLAND_DISPLAY") != nullptr)
{
_isWayland = true;
}
else
#endif
{
rc = true;
}
return rc;
}
bool QtGrabber::setupDisplay()
{
// cleanup last screen
freeResources();
QScreen* primary = QGuiApplication::primaryScreen();
QList<QScreen *> screens = QGuiApplication::screens();
// inject main screen at 0, if not nullptr
if(primary != nullptr)
bool result = false;
if ( ! open() )
{
screens.prepend(primary);
// remove last main screen if twice in list
if(screens.lastIndexOf(primary) > 0)
screens.removeAt(screens.lastIndexOf(primary));
if ( _isWayland )
{
Error(_log, "Grabber does not work under Wayland!");
}
}
if(screens.isEmpty())
else
{
Error(_log, "No displays found to capture from!");
return false;
// cleanup last screen
freeResources();
_numberOfSDisplays = 0;
QScreen* primary = QGuiApplication::primaryScreen();
QList<QScreen *> screens = QGuiApplication::screens();
// inject main screen at 0, if not nullptr
if(primary != nullptr)
{
screens.prepend(primary);
// remove last main screen if twice in list
if(screens.lastIndexOf(primary) > 0)
{
screens.removeAt(screens.lastIndexOf(primary));
}
}
if(screens.isEmpty())
{
Error(_log, "No displays found to capture from!");
result = false;
}
else
{
_numberOfSDisplays = screens.size();
Info(_log,"Available Displays:");
int index = 0;
for(auto * screen : qAsConst(screens))
{
const QRect geo = screen->geometry();
Info(_log,"Display %d: Name: %s Geometry: (L,T,R,B) %d,%d,%d,%d Depth:%dbit", index, QSTRING_CSTR(screen->name()), geo.left(), geo.top() ,geo.right(), geo.bottom(), screen->depth());
++index;
}
if (screens.at(0)->size() != screens.at(0)->virtualSize())
{
const QRect vgeo = screens.at(0)->virtualGeometry();
Info(_log,"Display %d: Name: %s Geometry: (L,T,R,B) %d,%d,%d,%d Depth:%dbit", _numberOfSDisplays, "All Displays", vgeo.left(), vgeo.top() ,vgeo.right(), vgeo.bottom(), screens.at(0)->depth());
}
_isVirtual = false;
// be sure the index is available
if (_display > _numberOfSDisplays - 1 )
{
if ((screens.at(0)->size() != screens.at(0)->virtualSize()) && (_display == _numberOfSDisplays))
{
_isVirtual = true;
_display = 0;
}
else
{
Info(_log, "The requested display index '%d' is not available, falling back to display 0", _display);
_display = 0;
}
}
// init the requested display
_screen = screens.at(_display);
connect(_screen, &QScreen::geometryChanged, this, &QtGrabber::geometryChanged);
updateScreenDimensions(true);
if (_isVirtual)
{
Info(_log, "Using virtual display across all screens");
}
else
{
Info(_log,"Initialized display %d", _display);
}
result = true;
}
}
Info(_log,"Available Displays:");
int index = 0;
for(auto screen : screens)
{
const QRect geo = screen->geometry();
Info(_log,"Display %d: Name:%s Geometry: (L,T,R,B) %d,%d,%d,%d Depth:%dbit", index, QSTRING_CSTR(screen->name()), geo.left(), geo.top() ,geo.right(), geo.bottom(), screen->depth());
index++;
}
// be sure the index is available
if(_display > unsigned(screens.size()-1))
{
Info(_log, "The requested display index '%d' is not available, falling back to display 0", _display);
_display = 0;
}
// init the requested display
_screen = screens.at(_display);
connect(_screen, &QScreen::geometryChanged, this, &QtGrabber::geometryChanged);
updateScreenDimensions(true);
Info(_log,"Initialized display %d", _display);
return true;
return result;
}
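The index handling in `setupDisplay()` above treats indices 0..n-1 as physical screens and index n as the virtual all-screens display (when the primary screen's size differs from its virtual size); any other index falls back to display 0. That selection logic can be sketched in isolation; the struct and function names here are illustrative only:

```cpp
#include <cassert>

// Illustrative model of QtGrabber's display-index fallback.
struct DisplayChoice
{
	int  index;
	bool isVirtual;
};

static DisplayChoice resolveDisplayIndex(int requested, int screenCount, bool hasVirtualDisplay)
{
	if (requested >= 0 && requested < screenCount)
	{
		return { requested, false }; // a physical screen
	}
	if (hasVirtualDisplay && requested == screenCount)
	{
		return { 0, true }; // capture across all screens via screen 0
	}
	return { 0, false }; // index not available, fall back to display 0
}
```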
void QtGrabber::geometryChanged(const QRect &geo)
@@ -91,90 +158,109 @@ void QtGrabber::geometryChanged(const QRect &geo)
int QtGrabber::grabFrame(Image<ColorRgb> & image)
{
int rc = 0;
if (_isEnabled && !_isDeviceInError)
{
if(_screen == nullptr)
{
// reinit, this will disable capture on failure
bool result = setupDisplay();
setEnabled(result);
}
if (_isEnabled)
{
QPixmap originalPixmap = _screen->grabWindow(0, _src_x, _src_y, _src_x_max, _src_y_max);
if (originalPixmap.isNull())
{
rc = -1;
}
else
{
QImage imageFrame = originalPixmap.toImage().scaled(_calculatedWidth, _calculatedHeight).convertToFormat( QImage::Format_RGB888);
image.resize(static_cast<uint>(_calculatedWidth), static_cast<uint>(_calculatedHeight));
for (int y = 0; y < imageFrame.height(); y++)
{
memcpy((unsigned char*)image.memptr() + y * image.width() * 3, static_cast<unsigned char*>(imageFrame.scanLine(y)), imageFrame.width() * 3);
}
}
}
}
return rc;
}
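The scanline loop in `grabFrame()` copies each converted RGB888 row into the tightly packed `Image<ColorRgb>` buffer. A standalone sketch with plain buffers (a padded source stride stands in for a QImage scanline, which may be 4-byte aligned; names are illustrative, not part of the grabber API):

```cpp
#include <cassert>
#include <cstring>
#include <vector>

// Copy width*3 payload bytes per row out of a padded source image
// into a tightly packed destination, mirroring the memcpy loop in
// QtGrabber::grabFrame(). srcStride may exceed width*3.
static void copyRgbRows(const unsigned char* src, int srcStride,
                        unsigned char* dst, int width, int height)
{
	for (int y = 0; y < height; ++y)
	{
		std::memcpy(dst + y * width * 3, src + y * srcStride, width * 3);
	}
}
```

Copying row by row (rather than one big memcpy) is what makes the loop correct when the source rows carry alignment padding.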
int QtGrabber::updateScreenDimensions(bool force)
{
if(_screen == nullptr)
{
return -1;
}
QRect geo;
if (_isVirtual)
{
geo = _screen->virtualGeometry();
}
else
{
geo = _screen->geometry();
}
if (!force && _width == geo.width() && _height == geo.height())
{
// No update required
return 0;
}
Info(_log, "Update of screen resolution: [%dx%d] to [%dx%d]", _width, _height, geo.width(), geo.height());
_width = geo.width();
_height = geo.height();
int width=0;
int height=0;
// Image scaling is performed by Qt
width = (_width > (_cropLeft + _cropRight))
? ((_width - _cropLeft - _cropRight) / _pixelDecimation)
: (_width / _pixelDecimation);
height = (_height > (_cropTop + _cropBottom))
? ((_height - _cropTop - _cropBottom) / _pixelDecimation)
: (_height / _pixelDecimation);
// calculate final image dimensions and adjust top/left cropping in 3D modes
switch (_videoMode)
{
case VideoMode::VIDEO_3DSBS:
_calculatedWidth = width / 2;
_calculatedHeight = height;
_src_x = _cropLeft / 2;
_src_y = _cropTop;
_src_x_max = (_width / 2) - _cropRight - _cropLeft;
_src_y_max = _height - _cropBottom - _cropTop;
break;
case VideoMode::VIDEO_3DTAB:
_calculatedWidth = width;
_calculatedHeight = height / 2;
_src_x = _cropLeft;
_src_y = _cropTop / 2;
_src_x_max = _width - _cropRight - _cropLeft;
_src_y_max = (_height / 2) - _cropBottom - _cropTop;
break;
case VideoMode::VIDEO_2D:
default:
_calculatedWidth = width;
_calculatedHeight = height;
_src_x = _cropLeft;
_src_y = _cropTop;
_src_x_max = _width - _cropRight - _cropLeft;
_src_y_max = _height - _cropBottom - _cropTop;
break;
}
Info(_log, "Update output image resolution to [%dx%d]", _calculatedWidth, _calculatedHeight);
Info(_log, "Update output image resolution to [%dx%d]", _calculatedWidth, _calculatedHeight);
return 1;
}
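The crop-plus-decimation arithmetic in `updateScreenDimensions()` can be checked in isolation. A minimal sketch (the function name `scaledExtent` is a stand-in; it mirrors the ternary used for both `width` and `height` above):

```cpp
#include <cassert>

// Output dimension after cropping and pixel decimation, as computed in
// QtGrabber::updateScreenDimensions(): the crop is only subtracted when
// it leaves a positive extent, then the result is divided by the
// decimation factor (integer division).
static int scaledExtent(int extent, int cropA, int cropB, int pixelDecimation)
{
	return (extent > (cropA + cropB))
	       ? ((extent - cropA - cropB) / pixelDecimation)
	       : (extent / pixelDecimation);
}
```

Ignoring an oversized crop (rather than producing a zero or negative extent) keeps the later `Image::resize` call valid.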
@@ -184,22 +270,129 @@ void QtGrabber::setVideoMode(VideoMode mode)
updateScreenDimensions(true);
}
bool QtGrabber::setPixelDecimation(int pixelDecimation)
{
bool rc (true);
if(Grabber::setPixelDecimation(pixelDecimation))
{
if ( updateScreenDimensions(true) < 0)
{
rc = false;
}
}
return rc;
}
void QtGrabber::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
Grabber::setCropping(cropLeft, cropRight, cropTop, cropBottom);
updateScreenDimensions(true);
}
bool QtGrabber::setDisplayIndex(int index)
{
bool rc (true);
if (_display != index)
{
if (index <= _numberOfSDisplays)
{
_display = index;
}
else
{
_display = 0;
}
rc = setupDisplay();
}
return rc;
}
QJsonObject QtGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
if ( open() )
{
QList<QScreen*> screens = QGuiApplication::screens();
if (!screens.isEmpty())
{
inputsDiscovered["device"] = "qt";
inputsDiscovered["device_name"] = "QT";
inputsDiscovered["type"] = "screen";
QJsonArray video_inputs;
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
for (int i = 0; i < screens.size(); ++i)
{
QJsonObject in;
QString name = screens.at(i)->name();
int pos = name.lastIndexOf('\\');
if (pos != -1)
{
name = name.right(name.length()-pos-1);
}
in["name"] = name;
in["inputIdx"] = i;
QJsonArray formats;
QJsonObject format;
QJsonArray resolutionArray;
QJsonObject resolution;
resolution["width"] = screens.at(i)->size().width();
resolution["height"] = screens.at(i)->size().height();
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
if (screens.at(0)->size() != screens.at(0)->virtualSize())
{
QJsonObject in;
in["name"] = "All Displays";
in["inputIdx"] = screens.size();
in["virtual"] = true;
QJsonArray formats;
QJsonObject format;
QJsonArray resolutionArray;
QJsonObject resolution;
resolution["width"] = screens.at(0)->virtualSize().width();
resolution["height"] = screens.at(0)->virtualSize().height();
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
inputsDiscovered["video_inputs"] = video_inputs;
}
if (inputsDiscovered.isEmpty())
{
DebugIf(verbose, _log, "No displays found to capture from!");
}
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}
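`discover()` trims each screen name to the text after the last backslash, so Windows device paths such as `\\.\DISPLAY1` are reported as `DISPLAY1`. A standalone sketch with `std::string` (the example path is illustrative; the real code uses `QString::lastIndexOf`/`right`):

```cpp
#include <cassert>
#include <string>

// Keep only the text after the last backslash, as QtGrabber::discover()
// does for Windows screen names like "\\.\DISPLAY1".
static std::string displayName(const std::string& name)
{
	const std::string::size_type pos = name.find_last_of('\\');
	return (pos == std::string::npos) ? name : name.substr(pos + 1);
}
```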

View File

@@ -1,9 +1,20 @@
#include <grabber/QtWrapper.h>
QtWrapper::QtWrapper( int updateRate_Hz,
int display,
int pixelDecimation,
int cropLeft, int cropRight, int cropTop, int cropBottom
)
: GrabberWrapper("Qt", &_grabber, updateRate_Hz)
, _grabber(display, cropLeft, cropRight, cropTop, cropBottom)
{
_grabber.setPixelDecimation(pixelDecimation);
}
bool QtWrapper::open()
{
return _grabber.open();
}
void QtWrapper::action()
{

View File

@@ -1,18 +0,0 @@
# Define the current source locations
SET(CURRENT_HEADER_DIR ${CMAKE_SOURCE_DIR}/include/grabber)
SET(CURRENT_SOURCE_DIR ${CMAKE_SOURCE_DIR}/libsrc/grabber/v4l2)
FILE ( GLOB V4L2_SOURCES "${CURRENT_HEADER_DIR}/V4L2*.h" "${CURRENT_SOURCE_DIR}/*.h" "${CURRENT_SOURCE_DIR}/*.cpp" )
add_library(v4l2-grabber ${V4L2_SOURCES} )
target_link_libraries(v4l2-grabber
hyperion
${QT_LIBRARIES}
)
if(TURBOJPEG_FOUND)
target_link_libraries(v4l2-grabber ${TurboJPEG_LIBRARY})
elseif (JPEG_FOUND)
target_link_libraries(v4l2-grabber ${JPEG_LIBRARY})
endif(TURBOJPEG_FOUND)

File diff suppressed because it is too large

View File

@@ -1,156 +0,0 @@
#include <QMetaType>
#include <grabber/V4L2Wrapper.h>
// qt
#include <QTimer>
V4L2Wrapper::V4L2Wrapper(const QString &device,
unsigned grabWidth,
unsigned grabHeight,
unsigned fps,
unsigned input,
VideoStandard videoStandard,
PixelFormat pixelFormat,
int pixelDecimation )
: GrabberWrapper("V4L2:"+device, &_grabber, grabWidth, grabHeight, 10)
, _grabber(device,
grabWidth,
grabHeight,
fps,
input,
videoStandard,
pixelFormat,
pixelDecimation)
{
_ggrabber = &_grabber;
// register the image type
qRegisterMetaType<Image<ColorRgb>>("Image<ColorRgb>");
// Handle the image in the captured thread using a direct connection
connect(&_grabber, &V4L2Grabber::newFrame, this, &V4L2Wrapper::newFrame, Qt::DirectConnection);
connect(&_grabber, &V4L2Grabber::readError, this, &V4L2Wrapper::readError, Qt::DirectConnection);
}
V4L2Wrapper::~V4L2Wrapper()
{
stop();
}
bool V4L2Wrapper::start()
{
return ( _grabber.start() && GrabberWrapper::start());
}
void V4L2Wrapper::stop()
{
_grabber.stop();
GrabberWrapper::stop();
}
void V4L2Wrapper::setSignalThreshold(double redSignalThreshold, double greenSignalThreshold, double blueSignalThreshold)
{
_grabber.setSignalThreshold( redSignalThreshold, greenSignalThreshold, blueSignalThreshold, 50);
}
void V4L2Wrapper::setCropping(unsigned cropLeft, unsigned cropRight, unsigned cropTop, unsigned cropBottom)
{
_grabber.setCropping(cropLeft, cropRight, cropTop, cropBottom);
}
void V4L2Wrapper::setSignalDetectionOffset(double verticalMin, double horizontalMin, double verticalMax, double horizontalMax)
{
_grabber.setSignalDetectionOffset(verticalMin, horizontalMin, verticalMax, horizontalMax);
}
void V4L2Wrapper::newFrame(const Image<ColorRgb> &image)
{
emit systemImage(_grabberName, image);
}
void V4L2Wrapper::readError(const char* err)
{
Error(_log, "stop grabber, because reading device failed. (%s)", err);
stop();
}
void V4L2Wrapper::action()
{
// dummy as v4l get notifications from stream
}
void V4L2Wrapper::setSignalDetectionEnable(bool enable)
{
_grabber.setSignalDetectionEnable(enable);
}
bool V4L2Wrapper::getSignalDetectionEnable() const
{
return _grabber.getSignalDetectionEnabled();
}
void V4L2Wrapper::setCecDetectionEnable(bool enable)
{
_grabber.setCecDetectionEnable(enable);
}
bool V4L2Wrapper::getCecDetectionEnable() const
{
return _grabber.getCecDetectionEnabled();
}
void V4L2Wrapper::setDeviceVideoStandard(const QString& device, VideoStandard videoStandard)
{
_grabber.setDeviceVideoStandard(device, videoStandard);
}
void V4L2Wrapper::handleCecEvent(CECEvent event)
{
_grabber.handleCecEvent(event);
}
void V4L2Wrapper::handleSettingsUpdate(settings::type type, const QJsonDocument& config)
{
if(type == settings::V4L2 && _grabberName.startsWith("V4L"))
{
// extract settings
const QJsonObject& obj = config.object();
// pixel decimation for v4l
_grabber.setPixelDecimation(obj["sizeDecimation"].toInt(8));
// crop for v4l
_grabber.setCropping(
obj["cropLeft"].toInt(0),
obj["cropRight"].toInt(0),
obj["cropTop"].toInt(0),
obj["cropBottom"].toInt(0));
// device input
_grabber.setInput(obj["input"].toInt(-1));
// device resolution
_grabber.setWidthHeight(obj["width"].toInt(0), obj["height"].toInt(0));
// device framerate
_grabber.setFramerate(obj["fps"].toInt(15));
// CEC Standby
_grabber.setCecDetectionEnable(obj["cecDetection"].toBool(true));
_grabber.setSignalDetectionEnable(obj["signalDetection"].toBool(true));
_grabber.setSignalDetectionOffset(
obj["sDHOffsetMin"].toDouble(0.25),
obj["sDVOffsetMin"].toDouble(0.25),
obj["sDHOffsetMax"].toDouble(0.75),
obj["sDVOffsetMax"].toDouble(0.75));
_grabber.setSignalThreshold(
obj["redSignalThreshold"].toDouble(0.0)/100.0,
obj["greenSignalThreshold"].toDouble(0.0)/100.0,
obj["blueSignalThreshold"].toDouble(0.0)/100.0);
_grabber.setDeviceVideoStandard(
obj["device"].toString("auto"),
parseVideoStandard(obj["standard"].toString("no-change")));
}
}

View File

@@ -0,0 +1,33 @@
# Common cmake definition for external video grabber
# Add Turbo JPEG library
if (ENABLE_V4L2 OR ENABLE_MF)
find_package(TurboJPEG)
if (TURBOJPEG_FOUND)
add_definitions(-DHAVE_TURBO_JPEG)
message( STATUS "Using Turbo JPEG library: ${TurboJPEG_LIBRARY}")
include_directories(${TurboJPEG_INCLUDE_DIRS})
else ()
message( STATUS "Turbo JPEG library not found, MJPEG camera format won't work.")
endif ()
endif()
# Define the wrapper/header/source locations and collect them
SET(WRAPPER_DIR ${CMAKE_SOURCE_DIR}/libsrc/grabber/video)
SET(HEADER_DIR ${CMAKE_SOURCE_DIR}/include/grabber)
if (ENABLE_MF)
project(mf-grabber)
SET(CURRENT_SOURCE_DIR ${CMAKE_SOURCE_DIR}/libsrc/grabber/video/mediafoundation)
FILE (GLOB SOURCES "${WRAPPER_DIR}/*.cpp" "${HEADER_DIR}/Video*.h" "${HEADER_DIR}/MF*.h" "${HEADER_DIR}/Encoder*.h" "${CURRENT_SOURCE_DIR}/*.h" "${CURRENT_SOURCE_DIR}/*.cpp")
elseif(ENABLE_V4L2)
project(v4l2-grabber)
SET(CURRENT_SOURCE_DIR ${CMAKE_SOURCE_DIR}/libsrc/grabber/video/v4l2)
FILE (GLOB SOURCES "${WRAPPER_DIR}/*.cpp" "${HEADER_DIR}/Video*.h" "${HEADER_DIR}/V4L2*.h" "${HEADER_DIR}/Encoder*.h" "${CURRENT_SOURCE_DIR}/*.cpp")
endif()
add_library(${PROJECT_NAME} ${SOURCES})
target_link_libraries(${PROJECT_NAME} hyperion ${QT_LIBRARIES})
if(TURBOJPEG_FOUND)
target_link_libraries(${PROJECT_NAME} ${TurboJPEG_LIBRARY})
endif()

View File

@@ -0,0 +1,203 @@
#include "grabber/EncoderThread.h"
EncoderThread::EncoderThread()
: _localData(nullptr)
, _scalingFactorsCount(0)
, _imageResampler()
#ifdef HAVE_TURBO_JPEG
, _transform(nullptr)
, _decompress(nullptr)
, _scalingFactors(nullptr)
, _xform(nullptr)
#endif
{}
EncoderThread::~EncoderThread()
{
#ifdef HAVE_TURBO_JPEG
if (_transform)
tjDestroy(_transform);
if (_decompress)
tjDestroy(_decompress);
#endif
if (_localData)
#ifdef HAVE_TURBO_JPEG
tjFree(_localData);
#else
delete[] _localData;
#endif
}
void EncoderThread::setup(
PixelFormat pixelFormat, uint8_t* sharedData,
int size, int width, int height, int lineLength,
unsigned cropLeft, unsigned cropTop, unsigned cropBottom, unsigned cropRight,
VideoMode videoMode, FlipMode flipMode, int pixelDecimation)
{
_lineLength = lineLength;
_pixelFormat = pixelFormat;
_size = (unsigned long) size;
_width = width;
_height = height;
_cropLeft = cropLeft;
_cropTop = cropTop;
_cropBottom = cropBottom;
_cropRight = cropRight;
_flipMode = flipMode;
_pixelDecimation = pixelDecimation;
_imageResampler.setVideoMode(videoMode);
_imageResampler.setFlipMode(_flipMode);
_imageResampler.setCropping(cropLeft, cropRight, cropTop, cropBottom);
_imageResampler.setHorizontalPixelDecimation(_pixelDecimation);
_imageResampler.setVerticalPixelDecimation(_pixelDecimation);
#ifdef HAVE_TURBO_JPEG
if (_localData)
tjFree(_localData);
_localData = (uint8_t*)tjAlloc(size + 1);
#else
delete[] _localData;
_localData = new uint8_t[size + 1];
#endif
memcpy(_localData, sharedData, size);
}
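`setup()` ends by snapshotting the shared capture buffer into `_localData`, so each encoder thread owns its frame while the source reader reuses the shared buffer. A minimal sketch of that ownership handoff using `std::vector` (the `LocalFrame` type is a hypothetical stand-in, not part of `EncoderThread`):

```cpp
#include <cassert>
#include <vector>

// Each encoder thread keeps a private copy of the shared capture
// buffer so the producer can overwrite its frame immediately,
// mirroring the memcpy into _localData in EncoderThread::setup().
struct LocalFrame
{
	std::vector<unsigned char> data;

	void snapshot(const unsigned char* shared, int size)
	{
		data.assign(shared, shared + size);
	}
};
```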
void EncoderThread::process()
{
_busy = true;
if (_width > 0 && _height > 0)
{
#ifdef HAVE_TURBO_JPEG
if (_pixelFormat == PixelFormat::MJPEG)
{
processImageMjpeg();
}
else
#endif
{
if (_pixelFormat == PixelFormat::BGR24)
{
if (_flipMode == FlipMode::NO_CHANGE)
_imageResampler.setFlipMode(FlipMode::HORIZONTAL);
else if (_flipMode == FlipMode::HORIZONTAL)
_imageResampler.setFlipMode(FlipMode::NO_CHANGE);
else if (_flipMode == FlipMode::VERTICAL)
_imageResampler.setFlipMode(FlipMode::BOTH);
else if (_flipMode == FlipMode::BOTH)
_imageResampler.setFlipMode(FlipMode::VERTICAL);
}
Image<ColorRgb> image = Image<ColorRgb>();
_imageResampler.processImage(
_localData,
_width,
_height,
_lineLength,
#if defined(ENABLE_V4L2)
_pixelFormat,
#else
PixelFormat::BGR24,
#endif
image
);
emit newFrame(image);
}
}
_busy = false;
}
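The BGR24 branch in `process()` toggles the horizontal component of the configured flip mode (BGR24 frames arrive mirrored, per the changelog's "BGR24 images always flipped"). The if/else ladder is a small involution on `FlipMode`; a sketch with a locally redefined enum (the enum values are assumed to match the project's `FlipMode`):

```cpp
#include <cassert>

enum class FlipMode { NO_CHANGE, HORIZONTAL, VERTICAL, BOTH };

// Toggle the horizontal flip component, as EncoderThread::process()
// does for BGR24 input before handing the frame to the resampler.
static FlipMode bgr24Flip(FlipMode m)
{
	switch (m)
	{
	case FlipMode::NO_CHANGE:  return FlipMode::HORIZONTAL;
	case FlipMode::HORIZONTAL: return FlipMode::NO_CHANGE;
	case FlipMode::VERTICAL:   return FlipMode::BOTH;
	case FlipMode::BOTH:       return FlipMode::VERTICAL;
	}
	return m;
}
```

Applying the mapping twice restores the original mode, which is what makes it safe to derive the effective flip from the user setting on every frame.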
#ifdef HAVE_TURBO_JPEG
void EncoderThread::processImageMjpeg()
{
if (!_transform && _flipMode != FlipMode::NO_CHANGE)
{
_transform = tjInitTransform();
_xform = new tjtransform();
}
if (_flipMode == FlipMode::BOTH || _flipMode == FlipMode::HORIZONTAL)
{
_xform->op = TJXOP_HFLIP;
tjTransform(_transform, _localData, _size, 1, &_localData, &_size, _xform, TJFLAG_FASTDCT | TJFLAG_FASTUPSAMPLE);
}
if (_flipMode == FlipMode::BOTH || _flipMode == FlipMode::VERTICAL)
{
_xform->op = TJXOP_VFLIP;
tjTransform(_transform, _localData, _size, 1, &_localData, &_size, _xform, TJFLAG_FASTDCT | TJFLAG_FASTUPSAMPLE);
}
if (!_decompress)
{
_decompress = tjInitDecompress();
_scalingFactors = tjGetScalingFactors(&_scalingFactorsCount);
}
int subsamp = 0;
if (tjDecompressHeader2(_decompress, _localData, _size, &_width, &_height, &subsamp) != 0)
return;
int scaledWidth = _width, scaledHeight = _height;
if(_scalingFactors != nullptr && _pixelDecimation > 1)
{
for (int i = 0; i < _scalingFactorsCount ; i++)
{
const int tempWidth = TJSCALED(_width, _scalingFactors[i]);
const int tempHeight = TJSCALED(_height, _scalingFactors[i]);
if (tempWidth <= _width/_pixelDecimation && tempHeight <= _height/_pixelDecimation)
{
scaledWidth = tempWidth;
scaledHeight = tempHeight;
break;
}
}
if (scaledWidth == _width && scaledHeight == _height)
{
scaledWidth = TJSCALED(_width, _scalingFactors[_scalingFactorsCount-1]);
scaledHeight = TJSCALED(_height, _scalingFactors[_scalingFactorsCount-1]);
}
}
Image<ColorRgb> srcImage(scaledWidth, scaledHeight);
if (tjDecompress2(_decompress, _localData , _size, (unsigned char*)srcImage.memptr(), scaledWidth, 0, scaledHeight, TJPF_RGB, TJFLAG_FASTDCT | TJFLAG_FASTUPSAMPLE) != 0)
return;
// got image, process it
if (!(_cropLeft > 0 || _cropTop > 0 || _cropBottom > 0 || _cropRight > 0))
emit newFrame(srcImage);
else
{
// calculate the output size
int outputWidth = (_width - _cropLeft - _cropRight);
int outputHeight = (_height - _cropTop - _cropBottom);
if (outputWidth <= 0 || outputHeight <= 0)
return;
Image<ColorRgb> destImage(outputWidth, outputHeight);
for (unsigned int y = 0; y < destImage.height(); y++)
{
// source/dest point into the image buffers and must not be freed here;
// srcImage and destImage own their memory
unsigned char* source = (unsigned char*)srcImage.memptr() + (y + _cropTop)*srcImage.width()*3 + _cropLeft*3;
unsigned char* dest = (unsigned char*)destImage.memptr() + y*destImage.width()*3;
memcpy(dest, source, destImage.width()*3);
}
// emit
emit newFrame(destImage);
}
}
#endif
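The scaling-factor search in `processImageMjpeg()` walks the factors returned by `tjGetScalingFactors` and takes the first one whose scaled size fits within `dim / _pixelDecimation`, falling back to the last (smallest) factor. A standalone sketch; `ScalingFactor`/`scaled` are local stand-ins for `tjscalingfactor` and the `TJSCALED` macro (which rounds up), and the factor list in the test is an assumed large-to-small ordering:

```cpp
#include <cassert>
#include <vector>

// num/denom pair, like TurboJPEG's tjscalingfactor.
struct ScalingFactor { int num; int denom; };

// TJSCALED-style scaling: multiply then divide, rounding up.
static int scaled(int dim, ScalingFactor f)
{
	return (dim * f.num + f.denom - 1) / f.denom;
}

// Pick the first factor whose result fits within dim/pixelDecimation,
// falling back to the smallest factor, as processImageMjpeg() does.
static ScalingFactor pickFactor(int width, int height, int pixelDecimation,
                                const std::vector<ScalingFactor>& factors)
{
	for (const ScalingFactor& f : factors)
	{
		if (scaled(width, f) <= width / pixelDecimation &&
		    scaled(height, f) <= height / pixelDecimation)
		{
			return f;
		}
	}
	return factors.back();
}
```

Letting TurboJPEG decode directly at the reduced size avoids decompressing the full MJPEG frame and resampling afterwards.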

View File

@@ -0,0 +1,149 @@
#include <QMetaType>
#include <grabber/VideoWrapper.h>
// qt includes
#include <QTimer>
VideoWrapper::VideoWrapper()
#if defined(ENABLE_V4L2)
: GrabberWrapper("V4L2", &_grabber)
#elif defined(ENABLE_MF)
: GrabberWrapper("V4L2:MEDIA_FOUNDATION", &_grabber)
#endif
, _grabber()
{
// register the image type
qRegisterMetaType<Image<ColorRgb>>("Image<ColorRgb>");
// Handle the image in the captured thread (Media Foundation/V4L2) using a direct connection
connect(&_grabber, SIGNAL(newFrame(const Image<ColorRgb>&)), this, SLOT(newFrame(const Image<ColorRgb>&)), Qt::DirectConnection);
connect(&_grabber, SIGNAL(readError(const char*)), this, SLOT(readError(const char*)), Qt::DirectConnection);
}
VideoWrapper::~VideoWrapper()
{
stop();
}
bool VideoWrapper::start()
{
return (_grabber.prepare() && _grabber.start() && GrabberWrapper::start());
}
void VideoWrapper::stop()
{
_grabber.stop();
GrabberWrapper::stop();
}
#if defined(ENABLE_CEC) && !defined(ENABLE_MF)
void VideoWrapper::handleCecEvent(CECEvent event)
{
_grabber.handleCecEvent(event);
}
#endif
void VideoWrapper::handleSettingsUpdate(settings::type type, const QJsonDocument& config)
{
if(type == settings::V4L2 && _grabberName.startsWith("V4L2"))
{
// extract settings
const QJsonObject& obj = config.object();
// set global grabber state
setV4lGrabberState(obj["enable"].toBool(false));
if (getV4lGrabberState())
{
#if defined(ENABLE_MF)
// Device path
_grabber.setDevice(obj["device"].toString("none"));
#endif
#if defined(ENABLE_V4L2)
// Device path and name
_grabber.setDevice(obj["device"].toString("none"), obj["available_devices"].toString("none"));
#endif
// Device input
_grabber.setInput(obj["input"].toInt(0));
// Device resolution
_grabber.setWidthHeight(obj["width"].toInt(0), obj["height"].toInt(0));
// Device framerate
_grabber.setFramerate(obj["fps"].toInt(15));
// Device encoding format
_grabber.setEncoding(obj["encoding"].toString("NO_CHANGE"));
// Video standard
_grabber.setVideoStandard(parseVideoStandard(obj["standard"].toString("NO_CHANGE")));
// Image size decimation
_grabber.setPixelDecimation(obj["sizeDecimation"].toInt(8));
// Flip mode
_grabber.setFlipMode(parseFlipMode(obj["flip"].toString("NO_CHANGE")));
// Image cropping
_grabber.setCropping(
obj["cropLeft"].toInt(0),
obj["cropRight"].toInt(0),
obj["cropTop"].toInt(0),
obj["cropBottom"].toInt(0));
// Brightness, Contrast, Saturation, Hue
_grabber.setBrightnessContrastSaturationHue(
obj["hardware_brightness"].toInt(0),
obj["hardware_contrast"].toInt(0),
obj["hardware_saturation"].toInt(0),
obj["hardware_hue"].toInt(0));
#if defined(ENABLE_CEC) && defined(ENABLE_V4L2)
// CEC Standby
_grabber.setCecDetectionEnable(obj["cecDetection"].toBool(true));
#endif
// Software frame skipping
_grabber.setFpsSoftwareDecimation(obj["fpsSoftwareDecimation"].toInt(1));
// Signal detection
_grabber.setSignalDetectionEnable(obj["signalDetection"].toBool(true));
_grabber.setSignalDetectionOffset(
obj["sDHOffsetMin"].toDouble(0.25),
obj["sDVOffsetMin"].toDouble(0.25),
obj["sDHOffsetMax"].toDouble(0.75),
obj["sDVOffsetMax"].toDouble(0.75));
_grabber.setSignalThreshold(
obj["redSignalThreshold"].toDouble(0.0)/100.0,
obj["greenSignalThreshold"].toDouble(0.0)/100.0,
obj["blueSignalThreshold"].toDouble(0.0)/100.0,
obj["noSignalCounterThreshold"].toInt(50));
// Reload the Grabber if any settings have been changed that require it
_grabber.reload(getV4lGrabberState());
}
else
stop();
}
}
void VideoWrapper::newFrame(const Image<ColorRgb> &image)
{
emit systemImage(_grabberName, image);
}
void VideoWrapper::readError(const char* err)
{
Error(_log, "Stop grabber, because reading device failed. (%s)", err);
stop();
}
void VideoWrapper::action()
{
// dummy as v4l get notifications from stream
}

View File

@@ -0,0 +1,813 @@
#include "MFSourceReaderCB.h"
#include "grabber/MFGrabber.h"
// Constants
namespace { const bool verbose = false; }
// Need more video properties? Visit https://docs.microsoft.com/en-us/windows/win32/api/strmif/ne-strmif-videoprocampproperty
using VideoProcAmpPropertyMap = QMap<VideoProcAmpProperty, QString>;
inline QMap<VideoProcAmpProperty, QString> initVideoProcAmpPropertyMap()
{
QMap<VideoProcAmpProperty, QString> propertyMap
{
{VideoProcAmp_Brightness, "brightness" },
{VideoProcAmp_Contrast , "contrast" },
{VideoProcAmp_Saturation, "saturation" },
{VideoProcAmp_Hue , "hue" }
};
return propertyMap;
}
Q_GLOBAL_STATIC_WITH_ARGS(VideoProcAmpPropertyMap, _videoProcAmpPropertyMap, (initVideoProcAmpPropertyMap()));
MFGrabber::MFGrabber()
: Grabber("V4L2:MEDIA_FOUNDATION")
, _currentDeviceName("none")
, _newDeviceName("none")
, _hr(S_FALSE)
, _sourceReader(nullptr)
, _sourceReaderCB(nullptr)
, _threadManager(nullptr)
, _pixelFormat(PixelFormat::NO_CHANGE)
, _pixelFormatConfig(PixelFormat::NO_CHANGE)
, _lineLength(-1)
, _frameByteSize(-1)
, _noSignalCounterThreshold(40)
, _noSignalCounter(0)
, _brightness(0)
, _contrast(0)
, _saturation(0)
, _hue(0)
, _currentFrame(0)
, _noSignalThresholdColor(ColorRgb{0,0,0})
, _signalDetectionEnabled(true)
, _noSignalDetected(false)
, _initialized(false)
, _reload(false)
, _x_frac_min(0.25)
, _y_frac_min(0.25)
, _x_frac_max(0.75)
, _y_frac_max(0.75)
{
CoInitializeEx(0, COINIT_MULTITHREADED);
_hr = MFStartup(MF_VERSION, MFSTARTUP_NOSOCKET);
if (FAILED(_hr))
CoUninitialize();
}
MFGrabber::~MFGrabber()
{
uninit();
SAFE_RELEASE(_sourceReader);
if (_sourceReaderCB != nullptr)
while (_sourceReaderCB->isBusy()) {}
SAFE_RELEASE(_sourceReaderCB);
if (_threadManager)
delete _threadManager;
_threadManager = nullptr;
if (SUCCEEDED(_hr) && SUCCEEDED(MFShutdown()))
CoUninitialize();
}
bool MFGrabber::prepare()
{
if (SUCCEEDED(_hr))
{
if (!_sourceReaderCB)
_sourceReaderCB = new SourceReaderCB(this);
if (!_threadManager)
_threadManager = new EncoderThreadManager(this);
return (_sourceReaderCB != nullptr && _threadManager != nullptr);
}
return false;
}
bool MFGrabber::start()
{
if (!_initialized)
{
if (init())
{
connect(_threadManager, &EncoderThreadManager::newFrame, this, &MFGrabber::newThreadFrame);
_threadManager->start();
DebugIf(verbose, _log, "Decoding threads: %d", _threadManager->_threadCount);
start_capturing();
Info(_log, "Started");
return true;
}
else
{
Error(_log, "The Media Foundation Grabber could not be started");
return false;
}
}
else
return true;
}
void MFGrabber::stop()
{
if (_initialized)
{
_initialized = false;
_threadManager->stop();
disconnect(_threadManager, nullptr, nullptr, nullptr);
_sourceReader->Flush(MF_SOURCE_READER_FIRST_VIDEO_STREAM);
SAFE_RELEASE(_sourceReader);
_deviceProperties.clear();
_deviceControls.clear();
Info(_log, "Stopped");
}
}
bool MFGrabber::init()
{
// enumerate the video capture devices on the user's system
enumVideoCaptureDevices();
if (!_initialized && SUCCEEDED(_hr))
{
int deviceIndex = -1;
bool noDeviceName = _currentDeviceName.compare("none", Qt::CaseInsensitive) == 0 || _currentDeviceName.compare("auto", Qt::CaseInsensitive) == 0;
if (noDeviceName)
return false;
if (!_deviceProperties.contains(_currentDeviceName))
{
Debug(_log, "Configured device '%s' is not available.", QSTRING_CSTR(_currentDeviceName));
return false;
}
Debug(_log, "Searching for %s %d x %d @ %d fps (%s)", QSTRING_CSTR(_currentDeviceName), _width, _height,_fps, QSTRING_CSTR(pixelFormatToString(_pixelFormat)));
QList<DeviceProperties> dev = _deviceProperties[_currentDeviceName];
for ( int i = 0; i < dev.count() && deviceIndex < 0; ++i )
{
if (dev[i].width != _width || dev[i].height != _height || dev[i].fps != _fps || dev[i].pf != _pixelFormat)
continue;
else
deviceIndex = i;
}
if (deviceIndex >= 0 && SUCCEEDED(init_device(_currentDeviceName, dev[deviceIndex])))
{
_initialized = true;
_newDeviceName = _currentDeviceName;
}
else
{
Debug(_log, "Configured device '%s' is not available.", QSTRING_CSTR(_currentDeviceName));
return false;
}
}
return _initialized;
}
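`init()` looks for an exact width/height/fps/pixel-format match among the formats enumerated for the configured device and refuses to start otherwise. A minimal sketch of that search; `Format` is a hypothetical stand-in for the relevant fields of `DeviceProperties` (pixel format reduced to an int tag):

```cpp
#include <cassert>
#include <vector>

// Stand-in for the matching fields of MFGrabber::DeviceProperties.
struct Format { int width; int height; int fps; int pf; };

// Return the index of the first exact match, or -1 when the requested
// mode is not offered by the device (init() then reports it unusable).
static int findFormat(const std::vector<Format>& formats,
                      int width, int height, int fps, int pf)
{
	for (int i = 0; i < static_cast<int>(formats.size()); ++i)
	{
		const Format& f = formats[i];
		if (f.width == width && f.height == height && f.fps == fps && f.pf == pf)
		{
			return i;
		}
	}
	return -1;
}
```

Requiring an exact match keeps the Source Reader's media type identical to a mode the device advertised, rather than trusting a nearest-fit conversion.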
void MFGrabber::uninit()
{
// stop if the grabber was not stopped
if (_initialized)
{
Debug(_log,"Uninit grabber: %s", QSTRING_CSTR(_newDeviceName));
stop();
}
}
HRESULT MFGrabber::init_device(QString deviceName, DeviceProperties props)
{
PixelFormat pixelformat = GetPixelFormatForGuid(props.guid);
QString error;
IMFMediaSource* device = nullptr;
IMFAttributes* deviceAttributes = nullptr, *sourceReaderAttributes = nullptr;
IMFMediaType* type = nullptr;
HRESULT hr = S_OK;
Debug(_log, "Init %s, %d x %d @ %d fps (%s)", QSTRING_CSTR(deviceName), props.width, props.height, props.fps, QSTRING_CSTR(pixelFormatToString(pixelformat)));
DebugIf (verbose, _log, "Symbolic link: %s", QSTRING_CSTR(props.symlink));
hr = MFCreateAttributes(&deviceAttributes, 2);
if (FAILED(hr))
{
error = QString("Could not create device attributes (%1)").arg(hr);
goto done;
}
hr = deviceAttributes->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE, MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
if (FAILED(hr))
{
error = QString("SetGUID_MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE (%1)").arg(hr);
goto done;
}
if (FAILED(deviceAttributes->SetString(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK, (LPCWSTR)props.symlink.utf16())))
{
error = QString("IMFAttributes_SetString_MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK (%1)").arg(hr);
goto done;
}
hr = MFCreateDeviceSource(deviceAttributes, &device);
if (FAILED(hr))
{
error = QString("MFCreateDeviceSource (%1)").arg(hr);
goto done;
}
if (!device)
{
error = QString("Could not open device (%1)").arg(hr);
goto done;
}
else
Debug(_log, "Device opened");
IAMVideoProcAmp *pProcAmp = nullptr;
if (SUCCEEDED(device->QueryInterface(IID_PPV_ARGS(&pProcAmp))))
{
for (auto control : _deviceControls[deviceName])
{
switch (_videoProcAmpPropertyMap->key(control.property))
{
case VideoProcAmpProperty::VideoProcAmp_Brightness:
if (_brightness >= control.minValue && _brightness <= control.maxValue && _brightness != control.currentValue)
{
Debug(_log,"Set brightness to %i", _brightness);
pProcAmp->Set(VideoProcAmp_Brightness, _brightness, VideoProcAmp_Flags_Manual);
}
break;
case VideoProcAmpProperty::VideoProcAmp_Contrast:
if (_contrast >= control.minValue && _contrast <= control.maxValue && _contrast != control.currentValue)
{
Debug(_log,"Set contrast to %i", _contrast);
pProcAmp->Set(VideoProcAmp_Contrast, _contrast, VideoProcAmp_Flags_Manual);
}
break;
case VideoProcAmpProperty::VideoProcAmp_Saturation:
if (_saturation >= control.minValue && _saturation <= control.maxValue && _saturation != control.currentValue)
{
Debug(_log,"Set saturation to %i", _saturation);
pProcAmp->Set(VideoProcAmp_Saturation, _saturation, VideoProcAmp_Flags_Manual);
}
break;
case VideoProcAmpProperty::VideoProcAmp_Hue:
if (_hue >= control.minValue && _hue <= control.maxValue && _hue != control.currentValue)
{
Debug(_log,"Set hue to %i", _hue);
pProcAmp->Set(VideoProcAmp_Hue, _hue, VideoProcAmp_Flags_Manual);
}
break;
default:
break;
}
}
}
hr = MFCreateAttributes(&sourceReaderAttributes, 1);
if (FAILED(hr))
{
error = QString("Could not create Source Reader attributes (%1)").arg(hr);
goto done;
}
hr = sourceReaderAttributes->SetUnknown(MF_SOURCE_READER_ASYNC_CALLBACK, (IMFSourceReaderCallback *)_sourceReaderCB);
if (FAILED(hr))
{
error = QString("Could not set stream parameter: SetUnknown_MF_SOURCE_READER_ASYNC_CALLBACK (%1)").arg(hr);
hr = E_INVALIDARG;
goto done;
}
hr = MFCreateSourceReaderFromMediaSource(device, sourceReaderAttributes, &_sourceReader);
if (FAILED(hr))
{
error = QString("Could not create the Source Reader (%1)").arg(hr);
goto done;
}
hr = MFCreateMediaType(&type);
if (FAILED(hr))
{
error = QString("Could not create an empty media type (%1)").arg(hr);
goto done;
}
hr = type->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (FAILED(hr))
{
error = QString("Could not set stream parameter: SetGUID_MF_MT_MAJOR_TYPE (%1)").arg(hr);
goto done;
}
hr = type->SetGUID(MF_MT_SUBTYPE, props.guid);
if (FAILED(hr))
{
error = QString("Could not set stream parameter: SetGUID_MF_MT_SUBTYPE (%1)").arg(hr);
goto done;
}
hr = MFSetAttributeSize(type, MF_MT_FRAME_SIZE, props.width, props.height);
if (FAILED(hr))
{
error = QString("Could not set stream parameter: SMFSetAttributeSize_MF_MT_FRAME_SIZE (%1)").arg(hr);
goto done;
}
hr = MFSetAttributeSize(type, MF_MT_FRAME_RATE, props.numerator, props.denominator);
if (FAILED(hr))
{
error = QString("Could not set stream parameter: MFSetAttributeSize_MF_MT_FRAME_RATE (%1)").arg(hr);
goto done;
}
hr = MFSetAttributeRatio(type, MF_MT_PIXEL_ASPECT_RATIO, 1, 1);
if (FAILED(hr))
{
error = QString("Could not set stream parameter: MFSetAttributeRatio_MF_MT_PIXEL_ASPECT_RATIO (%1)").arg(hr);
goto done;
}
hr = _sourceReaderCB->InitializeVideoEncoder(type, pixelformat);
if (FAILED(hr))
{
error = QString("Failed to initialize the Video Encoder (%1)").arg(hr);
goto done;
}
hr = _sourceReader->SetCurrentMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, nullptr, type);
if (FAILED(hr))
{
error = QString("Failed to set media type on Source Reader (%1)").arg(hr);
}
done:
if (FAILED(hr))
{
emit readError(QSTRING_CSTR(error));
SAFE_RELEASE(_sourceReader);
}
else
{
_pixelFormat = props.pf;
_width = props.width;
_height = props.height;
_frameByteSize = _width * _height * 3;
_lineLength = _width * 3;
}
// Cleanup
SAFE_RELEASE(deviceAttributes);
SAFE_RELEASE(device);
SAFE_RELEASE(pProcAmp);
SAFE_RELEASE(type);
SAFE_RELEASE(sourceReaderAttributes);
return hr;
}
void MFGrabber::enumVideoCaptureDevices()
{
_deviceProperties.clear();
_deviceControls.clear();
IMFAttributes* attr;
if (SUCCEEDED(MFCreateAttributes(&attr, 1)))
{
if (SUCCEEDED(attr->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE, MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID)))
{
UINT32 count;
IMFActivate** devices;
if (SUCCEEDED(MFEnumDeviceSources(attr, &devices, &count)))
{
DebugIf (verbose, _log, "Detected devices: %u", count);
for (UINT32 i = 0; i < count; i++)
{
UINT32 length;
LPWSTR name;
LPWSTR symlink;
if (SUCCEEDED(devices[i]->GetAllocatedString(MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME, &name, &length)))
{
if (SUCCEEDED(devices[i]->GetAllocatedString(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK, &symlink, &length)))
{
QList<DeviceProperties> devicePropertyList;
QString dev = QString::fromUtf16((const ushort*)name);
IMFMediaSource *pSource = nullptr;
if (SUCCEEDED(devices[i]->ActivateObject(IID_PPV_ARGS(&pSource))))
{
DebugIf (verbose, _log, "Found capture device: %s", QSTRING_CSTR(dev));
IMFMediaType *pType = nullptr;
IMFSourceReader* reader;
if (SUCCEEDED(MFCreateSourceReaderFromMediaSource(pSource, NULL, &reader)))
{
for (DWORD j = 0; ; j++)
{
if (FAILED(reader->GetNativeMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, j, &pType)))
break;
GUID format;
UINT32 width = 0, height = 0, numerator = 0, denominator = 0;
if ( SUCCEEDED(pType->GetGUID(MF_MT_SUBTYPE, &format)) &&
SUCCEEDED(MFGetAttributeSize(pType, MF_MT_FRAME_SIZE, &width, &height)) &&
SUCCEEDED(MFGetAttributeRatio(pType, MF_MT_FRAME_RATE, &numerator, &denominator)))
{
PixelFormat pixelformat = GetPixelFormatForGuid(format);
if (pixelformat != PixelFormat::NO_CHANGE)
{
DeviceProperties properties;
properties.symlink = QString::fromUtf16((const ushort*)symlink);
properties.width = width;
properties.height = height;
properties.fps = numerator / denominator;
properties.numerator = numerator;
properties.denominator = denominator;
properties.pf = pixelformat;
properties.guid = format;
devicePropertyList.append(properties);
DebugIf (verbose, _log, "%s %d x %d @ %d fps (%s)", QSTRING_CSTR(dev), properties.width, properties.height, properties.fps, QSTRING_CSTR(pixelFormatToString(properties.pf)));
}
}
SAFE_RELEASE(pType);
}
IAMVideoProcAmp *videoProcAmp = nullptr;
if (SUCCEEDED(pSource->QueryInterface(IID_PPV_ARGS(&videoProcAmp))))
{
QList<DeviceControls> deviceControlList;
for (auto it = _videoProcAmpPropertyMap->begin(); it != _videoProcAmpPropertyMap->end(); it++)
{
long minVal, maxVal, stepVal, defaultVal, flag;
if (SUCCEEDED(videoProcAmp->GetRange(it.key(), &minVal, &maxVal, &stepVal, &defaultVal, &flag)))
{
if (flag & VideoProcAmp_Flags_Manual)
{
DeviceControls control;
control.property = it.value();
control.minValue = minVal;
control.maxValue = maxVal;
control.step = stepVal;
control.defaultValue = defaultVal;
long currentVal;
if (SUCCEEDED(videoProcAmp->Get(it.key(), &currentVal, &flag)))
{
control.currentValue = currentVal;
DebugIf(verbose, _log, "%s: min=%ld, max=%ld, step=%ld, default=%ld, current=%ld", QSTRING_CSTR(it.value()), minVal, maxVal, stepVal, defaultVal, currentVal);
}
else
break;
deviceControlList.append(control);
}
}
}
if (!deviceControlList.isEmpty())
_deviceControls.insert(dev, deviceControlList);
}
SAFE_RELEASE(videoProcAmp);
SAFE_RELEASE(reader);
}
SAFE_RELEASE(pSource);
}
if (!devicePropertyList.isEmpty())
_deviceProperties.insert(dev, devicePropertyList);
}
CoTaskMemFree(symlink);
}
CoTaskMemFree(name);
SAFE_RELEASE(devices[i]);
}
CoTaskMemFree(devices);
}
SAFE_RELEASE(attr);
}
}
}
void MFGrabber::start_capturing()
{
if (_initialized && _sourceReader && _threadManager)
{
HRESULT hr = _sourceReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, NULL, NULL, NULL, NULL);
if (FAILED(hr))
Error(_log, "ReadSample (%i)", hr);
}
}
void MFGrabber::process_image(const void *frameImageBuffer, int size)
{
int processFrameIndex = _currentFrame++;
// frame skipping
if ((processFrameIndex % (_fpsSoftwareDecimation + 1) != 0) && (_fpsSoftwareDecimation > 0))
return;
// We do want a new frame...
if (size < _frameByteSize && _pixelFormat != PixelFormat::MJPEG)
Error(_log, "Frame too small: %d < %d", size, _frameByteSize);
else if (_threadManager != nullptr)
{
for (int i = 0; i < _threadManager->_threadCount; i++)
{
if (!_threadManager->_threads[i]->isBusy())
{
_threadManager->_threads[i]->setup(_pixelFormat, (uint8_t*)frameImageBuffer, size, _width, _height, _lineLength, _cropLeft, _cropTop, _cropBottom, _cropRight, _videoMode, _flipMode, _pixelDecimation);
_threadManager->_threads[i]->process();
break;
}
}
}
}
void MFGrabber::receive_image(const void *frameImageBuffer, int size)
{
process_image(frameImageBuffer, size);
start_capturing();
}
void MFGrabber::newThreadFrame(Image<ColorRgb> image)
{
if (_signalDetectionEnabled)
{
// check signal (only in center of the resulting image, because some grabbers have noise values along the borders)
bool noSignal = true;
// top left
unsigned xOffset = image.width() * _x_frac_min;
unsigned yOffset = image.height() * _y_frac_min;
// bottom right
unsigned xMax = image.width() * _x_frac_max;
unsigned yMax = image.height() * _y_frac_max;
for (unsigned x = xOffset; noSignal && x < xMax; ++x)
for (unsigned y = yOffset; noSignal && y < yMax; ++y)
noSignal &= (ColorRgb&)image(x, y) <= _noSignalThresholdColor;
if (noSignal)
++_noSignalCounter;
else
{
if (_noSignalCounter >= _noSignalCounterThreshold)
{
_noSignalDetected = true;
Info(_log, "Signal detected");
}
_noSignalCounter = 0;
}
if ( _noSignalCounter < _noSignalCounterThreshold)
{
emit newFrame(image);
}
else if (_noSignalCounter == _noSignalCounterThreshold)
{
_noSignalDetected = false;
Info(_log, "Signal lost");
}
}
else
emit newFrame(image);
}
void MFGrabber::setDevice(const QString& device)
{
if (_currentDeviceName != device)
{
_currentDeviceName = device;
_reload = true;
}
}
bool MFGrabber::setInput(int input)
{
if (Grabber::setInput(input))
{
_reload = true;
return true;
}
return false;
}
bool MFGrabber::setWidthHeight(int width, int height)
{
if (Grabber::setWidthHeight(width, height))
{
_reload = true;
return true;
}
return false;
}
void MFGrabber::setEncoding(QString enc)
{
if (_pixelFormatConfig != parsePixelFormat(enc))
{
_pixelFormatConfig = parsePixelFormat(enc);
if (_initialized)
{
Debug(_log,"Set hardware encoding to: %s", QSTRING_CSTR(enc.toUpper()));
_reload = true;
}
else
_pixelFormat = _pixelFormatConfig;
}
}
void MFGrabber::setBrightnessContrastSaturationHue(int brightness, int contrast, int saturation, int hue)
{
if (_brightness != brightness || _contrast != contrast || _saturation != saturation || _hue != hue)
{
_brightness = brightness;
_contrast = contrast;
_saturation = saturation;
_hue = hue;
_reload = true;
}
}
void MFGrabber::setSignalThreshold(double redSignalThreshold, double greenSignalThreshold, double blueSignalThreshold, int noSignalCounterThreshold)
{
_noSignalThresholdColor.red = uint8_t(255*redSignalThreshold);
_noSignalThresholdColor.green = uint8_t(255*greenSignalThreshold);
_noSignalThresholdColor.blue = uint8_t(255*blueSignalThreshold);
_noSignalCounterThreshold = qMax(1, noSignalCounterThreshold);
if (_signalDetectionEnabled)
Info(_log, "Signal threshold set to: {%d, %d, %d} and frames: %d", _noSignalThresholdColor.red, _noSignalThresholdColor.green, _noSignalThresholdColor.blue, _noSignalCounterThreshold );
}
void MFGrabber::setSignalDetectionOffset(double horizontalMin, double verticalMin, double horizontalMax, double verticalMax)
{
// rainbow 16 stripes 0.47 0.2 0.49 0.8
// unicolor: 0.25 0.25 0.75 0.75
_x_frac_min = horizontalMin;
_y_frac_min = verticalMin;
_x_frac_max = horizontalMax;
_y_frac_max = verticalMax;
if (_signalDetectionEnabled)
Info(_log, "Signal detection area set to: %f,%f x %f,%f", _x_frac_min, _y_frac_min, _x_frac_max, _y_frac_max );
}
void MFGrabber::setSignalDetectionEnable(bool enable)
{
if (_signalDetectionEnabled != enable)
{
_signalDetectionEnabled = enable;
if (_initialized)
Info(_log, "Signal detection is now %s", enable ? "enabled" : "disabled");
}
}
bool MFGrabber::reload(bool force)
{
if (_reload || force)
{
if (_sourceReader)
{
Info(_log,"Reloading Media Foundation Grabber");
uninit();
_pixelFormat = _pixelFormatConfig;
_newDeviceName = _currentDeviceName;
}
_reload = false;
return prepare() && start();
}
return false;
}
QJsonArray MFGrabber::discover(const QJsonObject& params)
{
DebugIf (verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
enumVideoCaptureDevices();
QJsonArray inputsDiscovered;
for (auto it = _deviceProperties.begin(); it != _deviceProperties.end(); ++it)
{
QJsonObject device, in;
QJsonArray video_inputs, formats;
device["device"] = it.key();
device["device_name"] = it.key();
device["type"] = "v4l2";
in["name"] = "";
in["inputIdx"] = 0;
QStringList encodingFormats = QStringList();
for (int i = 0; i < _deviceProperties[it.key()].count(); ++i )
if (!encodingFormats.contains(pixelFormatToString(_deviceProperties[it.key()][i].pf), Qt::CaseInsensitive))
encodingFormats << pixelFormatToString(_deviceProperties[it.key()][i].pf).toLower();
for (auto encodingFormat : encodingFormats)
{
QJsonObject format;
QJsonArray resolutionArray;
format["format"] = encodingFormat;
QMultiMap<int, int> deviceResolutions = QMultiMap<int, int>();
for (int i = 0; i < _deviceProperties[it.key()].count(); ++i )
if (!deviceResolutions.contains(_deviceProperties[it.key()][i].width, _deviceProperties[it.key()][i].height) && _deviceProperties[it.key()][i].pf == parsePixelFormat(encodingFormat))
deviceResolutions.insert(_deviceProperties[it.key()][i].width, _deviceProperties[it.key()][i].height);
for (auto width_height = deviceResolutions.begin(); width_height != deviceResolutions.end(); width_height++)
{
QJsonObject resolution;
QJsonArray fps;
resolution["width"] = width_height.key();
resolution["height"] = width_height.value();
QIntList framerates = QIntList();
for (int i = 0; i < _deviceProperties[it.key()].count(); ++i )
{
int frameRate = _deviceProperties[it.key()][i].numerator / _deviceProperties[it.key()][i].denominator;
if (!framerates.contains(frameRate) && _deviceProperties[it.key()][i].pf == parsePixelFormat(encodingFormat) && _deviceProperties[it.key()][i].width == width_height.key() && _deviceProperties[it.key()][i].height == width_height.value())
framerates << frameRate;
}
for (auto framerate : framerates)
fps.append(framerate);
resolution["fps"] = fps;
resolutionArray.append(resolution);
}
format["resolutions"] = resolutionArray;
formats.append(format);
}
in["formats"] = formats;
video_inputs.append(in);
device["video_inputs"] = video_inputs;
QJsonObject controls, controls_default;
for (auto control : _deviceControls[it.key()])
{
QJsonObject property;
property["minValue"] = control.minValue;
property["maxValue"] = control.maxValue;
property["step"] = control.step;
property["current"] = control.currentValue;
controls[control.property] = property;
controls_default[control.property] = control.defaultValue;
}
device["properties"] = controls;
QJsonObject defaults, video_inputs_default, format_default, resolution_default;
resolution_default["width"] = 640;
resolution_default["height"] = 480;
resolution_default["fps"] = 25;
format_default["format"] = "bgr24";
format_default["resolution"] = resolution_default;
video_inputs_default["inputIdx"] = 0;
video_inputs_default["standards"] = "PAL";
video_inputs_default["formats"] = format_default;
defaults["video_input"] = video_inputs_default;
defaults["properties"] = controls_default;
device["default"] = defaults;
inputsDiscovered.append(device);
}
_deviceProperties.clear();
_deviceControls.clear();
DebugIf (verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}


@@ -0,0 +1,401 @@
#pragma once
#include <mfapi.h>
#include <mftransform.h>
#include <dmo.h>
#include <wmcodecdsp.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <shlwapi.h>
#include <mferror.h>
#include <strmif.h>
#include <comdef.h>
#include <atomic>
#pragma comment (lib, "ole32.lib")
#pragma comment (lib, "mf.lib")
#pragma comment (lib, "mfplat.lib")
#pragma comment (lib, "mfuuid.lib")
#pragma comment (lib, "mfreadwrite.lib")
#pragma comment (lib, "strmiids.lib")
#pragma comment (lib, "wmcodecdspuuid.lib")
#include <grabber/MFGrabber.h>
#define SAFE_RELEASE(x) do { if (x) { (x)->Release(); (x) = nullptr; } } while (0)
// Need more supported formats? Visit https://docs.microsoft.com/en-us/windows/win32/medfound/colorconverter
static PixelFormat GetPixelFormatForGuid(const GUID guid)
{
if (IsEqualGUID(guid, MFVideoFormat_RGB32)) return PixelFormat::RGB32;
if (IsEqualGUID(guid, MFVideoFormat_RGB24)) return PixelFormat::BGR24;
if (IsEqualGUID(guid, MFVideoFormat_YUY2)) return PixelFormat::YUYV;
if (IsEqualGUID(guid, MFVideoFormat_UYVY)) return PixelFormat::UYVY;
if (IsEqualGUID(guid, MFVideoFormat_MJPG)) return PixelFormat::MJPEG;
if (IsEqualGUID(guid, MFVideoFormat_NV12)) return PixelFormat::NV12;
if (IsEqualGUID(guid, MFVideoFormat_I420)) return PixelFormat::I420;
return PixelFormat::NO_CHANGE;
}
class SourceReaderCB : public IMFSourceReaderCallback
{
public:
SourceReaderCB(MFGrabber* grabber)
: _nRefCount(1)
, _grabber(grabber)
, _bEOS(FALSE)
, _hrStatus(S_OK)
, _transform(nullptr)
, _pixelformat(PixelFormat::NO_CHANGE)
, _isBusy(false)
{
// Initialize critical section.
InitializeCriticalSection(&_critsec);
}
// IUnknown methods
STDMETHODIMP QueryInterface(REFIID iid, void** ppv)
{
static const QITAB qit[] =
{
QITABENT(SourceReaderCB, IMFSourceReaderCallback),
{ 0 },
};
return QISearch(this, qit, iid, ppv);
}
STDMETHODIMP_(ULONG) AddRef()
{
return InterlockedIncrement(&_nRefCount);
}
STDMETHODIMP_(ULONG) Release()
{
ULONG uCount = InterlockedDecrement(&_nRefCount);
if (uCount == 0)
{
delete this;
}
return uCount;
}
// IMFSourceReaderCallback methods
STDMETHODIMP OnReadSample(HRESULT hrStatus, DWORD /*dwStreamIndex*/,
DWORD dwStreamFlags, LONGLONG llTimestamp, IMFSample* pSample)
{
EnterCriticalSection(&_critsec);
_isBusy = true;
if (_grabber->_sourceReader == nullptr)
{
_isBusy = false;
LeaveCriticalSection(&_critsec);
return S_OK;
}
if (dwStreamFlags & MF_SOURCE_READERF_STREAMTICK)
{
Debug(_grabber->_log, "Skipping stream gap");
LeaveCriticalSection(&_critsec);
_grabber->_sourceReader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, nullptr, nullptr, nullptr, nullptr);
return S_OK;
}
if (dwStreamFlags & MF_SOURCE_READERF_NATIVEMEDIATYPECHANGED)
{
IMFMediaType* type = nullptr;
GUID format;
if (SUCCEEDED(_grabber->_sourceReader->GetNativeMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, MF_SOURCE_READER_CURRENT_TYPE_INDEX, &type)) && SUCCEEDED(type->GetGUID(MF_MT_SUBTYPE, &format)))
{
Debug(_grabber->_log, "Native media type changed");
InitializeVideoEncoder(type, GetPixelFormatForGuid(format));
}
SAFE_RELEASE(type);
}
if (dwStreamFlags & MF_SOURCE_READERF_CURRENTMEDIATYPECHANGED)
{
IMFMediaType* type = nullptr;
GUID format;
if (SUCCEEDED(_grabber->_sourceReader->GetCurrentMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, &type)) && SUCCEEDED(type->GetGUID(MF_MT_SUBTYPE, &format)))
{
Debug(_grabber->_log, "Current media type changed");
InitializeVideoEncoder(type, GetPixelFormatForGuid(format));
}
SAFE_RELEASE(type);
}
// Variables declaration
IMFMediaBuffer* buffer = nullptr;
if (FAILED(hrStatus))
{
_hrStatus = hrStatus;
_com_error error(_hrStatus);
Error(_grabber->_log, "%s", error.ErrorMessage());
goto done;
}
if (!pSample)
{
Error(_grabber->_log, "Media sample is empty");
goto done;
}
if (_pixelformat != PixelFormat::MJPEG && _pixelformat != PixelFormat::BGR24 && _pixelformat != PixelFormat::NO_CHANGE)
{
pSample = TransformSample(_transform, pSample);
if (pSample == nullptr)
{
Error(_grabber->_log, "Sample transformation failed");
goto done;
}
}
_hrStatus = pSample->ConvertToContiguousBuffer(&buffer);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Buffer conversion failed => %s", error.ErrorMessage());
goto done;
}
BYTE* data = nullptr;
DWORD maxLength = 0, currentLength = 0;
_hrStatus = buffer->Lock(&data, &maxLength, &currentLength);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Access to the buffer memory failed => %s", error.ErrorMessage());
goto done;
}
_grabber->receive_image(data, currentLength);
_hrStatus = buffer->Unlock();
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Unlocking the buffer memory failed => %s", error.ErrorMessage());
}
done:
SAFE_RELEASE(buffer);
if (MF_SOURCE_READERF_ENDOFSTREAM & dwStreamFlags)
_bEOS = TRUE; // Reached the end of the stream.
if (_pixelformat != PixelFormat::MJPEG && _pixelformat != PixelFormat::BGR24 && _pixelformat != PixelFormat::NO_CHANGE)
SAFE_RELEASE(pSample);
_isBusy = false;
LeaveCriticalSection(&_critsec);
return _hrStatus;
}
HRESULT InitializeVideoEncoder(IMFMediaType* type, PixelFormat format)
{
_pixelformat = format;
if (format == PixelFormat::MJPEG || format == PixelFormat::BGR24 || format == PixelFormat::NO_CHANGE)
return S_OK;
// Variable declaration
IMFMediaType* output = nullptr;
DWORD mftStatus = 0;
// Create instance of IMFTransform interface pointer as CColorConvertDMO
_hrStatus = CoCreateInstance(CLSID_CColorConvertDMO, nullptr, CLSCTX_INPROC_SERVER, IID_IMFTransform, (void**)&_transform);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Creation of the Color Converter failed => %s", error.ErrorMessage());
goto done;
}
// Set input type as media type of our input stream
_hrStatus = _transform->SetInputType(0, type, 0);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Setting the input media type failed => %s", error.ErrorMessage());
goto done;
}
// Create new media type
_hrStatus = MFCreateMediaType(&output);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Creating a new media type failed => %s", error.ErrorMessage());
goto done;
}
// Copy all attributes from input type to output media type
_hrStatus = type->CopyAllItems(output);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Copying of all attributes from input to output media type failed => %s", error.ErrorMessage());
goto done;
}
UINT32 width, height;
UINT32 numerator, denominator;
// Fill the missing attributes
if (FAILED(output->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video)) ||
FAILED(output->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB24)) ||
FAILED(output->SetUINT32(MF_MT_FIXED_SIZE_SAMPLES, TRUE)) ||
FAILED(output->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE)) ||
FAILED(output->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive)) ||
FAILED(MFGetAttributeSize(type, MF_MT_FRAME_SIZE, &width, &height)) ||
FAILED(MFSetAttributeSize(output, MF_MT_FRAME_SIZE, width, height)) ||
FAILED(MFGetAttributeRatio(type, MF_MT_FRAME_RATE, &numerator, &denominator)) ||
FAILED(MFSetAttributeRatio(output, MF_MT_PIXEL_ASPECT_RATIO, 1, 1)))
{
Error(_grabber->_log, "Setting output media type attributes failed");
goto done;
}
// Set transform output type
_hrStatus = _transform->SetOutputType(0, output, 0);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Setting the output media type failed => %s", error.ErrorMessage());
goto done;
}
// Check if encoder parameters set properly
_hrStatus = _transform->GetInputStatus(0, &mftStatus);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to query the input stream for more data => %s", error.ErrorMessage());
goto done;
}
if (MFT_INPUT_STATUS_ACCEPT_DATA == mftStatus)
{
// Notify the transform we are about to begin streaming data
if (FAILED(_transform->ProcessMessage(MFT_MESSAGE_COMMAND_FLUSH, 0)) ||
FAILED(_transform->ProcessMessage(MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, 0)) ||
FAILED(_transform->ProcessMessage(MFT_MESSAGE_NOTIFY_START_OF_STREAM, 0)))
{
Error(_grabber->_log, "Failed to begin streaming data");
}
}
done:
SAFE_RELEASE(output);
return _hrStatus;
}
BOOL isBusy()
{
EnterCriticalSection(&_critsec);
BOOL result = _isBusy;
LeaveCriticalSection(&_critsec);
return result;
}
STDMETHODIMP OnEvent(DWORD, IMFMediaEvent*) { return S_OK; }
STDMETHODIMP OnFlush(DWORD) { return S_OK; }
private:
virtual ~SourceReaderCB()
{
if (_transform)
{
_transform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_OF_STREAM, 0);
_transform->ProcessMessage(MFT_MESSAGE_NOTIFY_END_STREAMING, 0);
}
SAFE_RELEASE(_transform);
// Delete critical section.
DeleteCriticalSection(&_critsec);
}
IMFSample* TransformSample(IMFTransform* transform, IMFSample* in_sample)
{
IMFSample* result = nullptr;
IMFMediaBuffer* out_buffer = nullptr;
MFT_OUTPUT_DATA_BUFFER outputDataBuffer = { 0 };
// Process the input sample
_hrStatus = transform->ProcessInput(0, in_sample, 0);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to process the input sample => %s", error.ErrorMessage());
goto done;
}
// Gets the buffer demand for the output stream
MFT_OUTPUT_STREAM_INFO streamInfo;
_hrStatus = transform->GetOutputStreamInfo(0, &streamInfo);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to retrieve the buffer requirements for the output stream => %s", error.ErrorMessage());
goto done;
}
// Create an output media buffer
_hrStatus = MFCreateMemoryBuffer(streamInfo.cbSize, &out_buffer);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to create an output media buffer => %s", error.ErrorMessage());
goto done;
}
// Create an empty media sample
_hrStatus = MFCreateSample(&result);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to create an empty media sample => %s", error.ErrorMessage());
goto done;
}
// Add the output media buffer to the media sample
_hrStatus = result->AddBuffer(out_buffer);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to add the output media buffer to the media sample => %s", error.ErrorMessage());
goto done;
}
// Create the output buffer structure
outputDataBuffer.dwStreamID = 0;
outputDataBuffer.dwStatus = 0;
outputDataBuffer.pEvents = nullptr;
outputDataBuffer.pSample = result;
DWORD status = 0;
// Generate the output sample
_hrStatus = transform->ProcessOutput(0, 1, &outputDataBuffer, &status);
if (FAILED(_hrStatus))
{
_com_error error(_hrStatus);
Error(_grabber->_log, "Failed to generate the output sample => %s", error.ErrorMessage());
}
else
{
SAFE_RELEASE(out_buffer);
return result;
}
done:
SAFE_RELEASE(out_buffer);
return nullptr;
}
private:
long _nRefCount;
CRITICAL_SECTION _critsec;
MFGrabber* _grabber;
BOOL _bEOS;
HRESULT _hrStatus;
IMFTransform* _transform;
PixelFormat _pixelformat;
std::atomic<bool> _isBusy;
};

File diff suppressed because it is too large


@@ -4,21 +4,33 @@
#include <xcb/randr.h>
#include <xcb/xcb_event.h>
X11Grabber::X11Grabber(int cropLeft, int cropRight, int cropTop, int cropBottom, int pixelDecimation)
: Grabber("X11GRABBER", 0, 0, cropLeft, cropRight, cropTop, cropBottom)
// Constants
namespace {
const bool verbose = false;
} //End of constants
X11Grabber::X11Grabber(int cropLeft, int cropRight, int cropTop, int cropBottom)
: Grabber("X11GRABBER", cropLeft, cropRight, cropTop, cropBottom)
, _x11Display(nullptr)
, _xImage(nullptr)
, _pixmap(None)
, _srcFormat(nullptr)
, _dstFormat(nullptr)
, _srcPicture(None)
, _dstPicture(None)
, _pixelDecimation(pixelDecimation)
, _screenWidth(0)
, _screenHeight(0)
, _calculatedWidth(0)
, _calculatedHeight(0)
, _src_x(cropLeft)
, _src_y(cropTop)
, _XShmAvailable(false)
, _XRenderAvailable(false)
, _XRandRAvailable(false)
, _isWayland (false)
, _logger{}
, _image(0,0)
{
_logger = Logger::getInstance("X11");
_useImageResampler = false;
_imageResampler.setCropping(0, 0, 0, 0); // cropping is performed by XRender, XShmGetImage or XGetImage
memset(&_pictAttr, 0, sizeof(_pictAttr));
@@ -37,7 +49,10 @@ X11Grabber::~X11Grabber()
void X11Grabber::freeResources()
{
// Cleanup allocated resources of the X11 grab
XDestroyImage(_xImage);
if (_xImage != nullptr)
{
XDestroyImage(_xImage);
}
if (_XRandRAvailable)
{
qApp->removeNativeEventFilter(this);
@@ -65,7 +80,7 @@ void X11Grabber::setupResources()
if(_XShmAvailable)
{
_xImage = XShmCreateImage(_x11Display, _windowAttr.visual, _windowAttr.depth, ZPixmap, NULL, &_shminfo, _width, _height);
_xImage = XShmCreateImage(_x11Display, _windowAttr.visual, _windowAttr.depth, ZPixmap, NULL, &_shminfo, _calculatedWidth, _calculatedHeight);
_shminfo.shmid = shmget(IPC_PRIVATE, (size_t) _xImage->bytes_per_line * _xImage->height, IPC_CREAT|0777);
_xImage->data = (char*)shmat(_shminfo.shmid,0,0);
_shminfo.shmaddr = _xImage->data;
@@ -75,17 +90,17 @@ void X11Grabber::setupResources()
if (_XRenderAvailable)
{
_useImageResampler = false;
_useImageResampler = false;
_imageResampler.setHorizontalPixelDecimation(1);
_imageResampler.setVerticalPixelDecimation(1);
if(_XShmPixmapAvailable)
{
_pixmap = XShmCreatePixmap(_x11Display, _window, _xImage->data, &_shminfo, _width, _height, _windowAttr.depth);
_pixmap = XShmCreatePixmap(_x11Display, _window, _xImage->data, &_shminfo, _calculatedWidth, _calculatedHeight, _windowAttr.depth);
}
else
{
_pixmap = XCreatePixmap(_x11Display, _window, _width, _height, _windowAttr.depth);
_pixmap = XCreatePixmap(_x11Display, _window, _calculatedWidth, _calculatedHeight, _windowAttr.depth);
}
_srcFormat = XRenderFindVisualFormat(_x11Display, _windowAttr.visual);
_dstFormat = XRenderFindVisualFormat(_x11Display, _windowAttr.visual);
@@ -96,49 +111,82 @@ void X11Grabber::setupResources()
}
else
{
_useImageResampler = true;
_useImageResampler = true;
_imageResampler.setHorizontalPixelDecimation(_pixelDecimation);
_imageResampler.setVerticalPixelDecimation(_pixelDecimation);
}
}
bool X11Grabber::Setup()
bool X11Grabber::open()
{
_x11Display = XOpenDisplay(NULL);
if (_x11Display == nullptr)
bool rc = false;
if (getenv("WAYLAND_DISPLAY") != nullptr)
{
Error(_log, "Unable to open display");
if (getenv("DISPLAY"))
_isWayland = true;
}
else
{
_x11Display = XOpenDisplay(nullptr);
if (_x11Display != nullptr)
{
Error(_log, "%s",getenv("DISPLAY"));
rc = true;
}
}
return rc;
}
bool X11Grabber::setupDisplay()
{
bool result = false;
if ( ! open() )
{
if ( _isWayland )
{
Error(_log, "Grabber does not work under Wayland!");
}
else
{
Error(_log, "DISPLAY environment variable not set");
if (getenv("DISPLAY") != nullptr)
{
Error(_log, "Unable to open display [%s]",getenv("DISPLAY"));
}
else
{
Error(_log, "DISPLAY environment variable not set");
}
}
return false;
}
else
{
_window = DefaultRootWindow(_x11Display);
_window = DefaultRootWindow(_x11Display);
int dummy, pixmaps_supported;
int dummy, pixmaps_supported;
_XRandRAvailable = XRRQueryExtension(_x11Display, &_XRandREventBase, &dummy);
_XRenderAvailable = XRenderQueryExtension(_x11Display, &dummy, &dummy);
_XShmAvailable = XShmQueryExtension(_x11Display);
XShmQueryVersion(_x11Display, &dummy, &dummy, &pixmaps_supported);
_XShmPixmapAvailable = pixmaps_supported && XShmPixmapFormat(_x11Display) == ZPixmap;
_XRandRAvailable = XRRQueryExtension(_x11Display, &_XRandREventBase, &dummy);
_XRenderAvailable = XRenderQueryExtension(_x11Display, &dummy, &dummy);
_XShmAvailable = XShmQueryExtension(_x11Display);
XShmQueryVersion(_x11Display, &dummy, &dummy, &pixmaps_supported);
_XShmPixmapAvailable = pixmaps_supported && XShmPixmapFormat(_x11Display) == ZPixmap;
Info(_log, QString("XRandR=[%1] XRender=[%2] XShm=[%3] XPixmap=[%4]")
.arg(_XRandRAvailable ? "available" : "unavailable")
.arg(_XRenderAvailable ? "available" : "unavailable")
.arg(_XShmAvailable ? "available" : "unavailable")
.arg(_XShmPixmapAvailable ? "available" : "unavailable")
.toStdString().c_str());
bool result = (updateScreenDimensions(true) >=0);
ErrorIf(!result, _log, "X11 Grabber start failed");
setEnabled(result);
result = (updateScreenDimensions(true) >=0);
ErrorIf(!result, _log, "X11 Grabber start failed");
setEnabled(result);
}
return result;
}
int X11Grabber::grabFrame(Image<ColorRgb> & image, bool forceUpdate)
{
if (!_enabled) return 0;
if (!_isEnabled) return 0;
if (forceUpdate)
updateScreenDimensions(forceUpdate);
@@ -176,7 +224,7 @@ int X11Grabber::grabFrame(Image<ColorRgb> & image, bool forceUpdate)
// src_y = cropTop, mask_x, mask_y, dest_x, dest_y, width, height
XRenderComposite(
_x11Display, PictOpSrc, _srcPicture, None, _dstPicture, ( _src_x/_pixelDecimation),
(_src_y/_pixelDecimation), 0, 0, 0, 0, _width, _height);
(_src_y/_pixelDecimation), 0, 0, 0, 0, _calculatedWidth, _calculatedHeight);
XSync(_x11Display, False);
@@ -186,7 +234,7 @@ int X11Grabber::grabFrame(Image<ColorRgb> & image, bool forceUpdate)
}
else
{
_xImage = XGetImage(_x11Display, _pixmap, 0, 0, _width, _height, AllPlanes, ZPixmap);
_xImage = XGetImage(_x11Display, _pixmap, 0, 0, _calculatedWidth, _calculatedHeight, AllPlanes, ZPixmap);
}
}
else if (_XShmAvailable)
@@ -197,7 +245,7 @@ int X11Grabber::grabFrame(Image<ColorRgb> & image, bool forceUpdate)
else
{
// all things done by xgetimage
_xImage = XGetImage(_x11Display, _window, _src_x, _src_y, _width, _height, AllPlanes, ZPixmap);
_xImage = XGetImage(_x11Display, _window, _src_x, _src_y, _calculatedWidth, _calculatedHeight, AllPlanes, ZPixmap);
}
if (_xImage == nullptr)
@@ -220,45 +268,46 @@ int X11Grabber::updateScreenDimensions(bool force)
return -1;
}
if (!force && _screenWidth == unsigned(_windowAttr.width) && _screenHeight == unsigned(_windowAttr.height))
if (!force && _width == _windowAttr.width && _height == _windowAttr.height)
{
// No update required
return 0;
}
if (_screenWidth || _screenHeight)
if (_width || _height)
{
freeResources();
}
Info(_log, "Update of screen resolution: [%dx%d] to [%dx%d]", _screenWidth, _screenHeight, _windowAttr.width, _windowAttr.height);
_screenWidth = _windowAttr.width;
_screenHeight = _windowAttr.height;
Info(_log, "Update of screen resolution: [%dx%d] to [%dx%d]", _width, _height, _windowAttr.width, _windowAttr.height);
_width = _windowAttr.width;
_height = _windowAttr.height;
int width=0, height=0;
int width=0;
int height=0;
// Image scaling is performed by XRender when available, otherwise by ImageResampler
if (_XRenderAvailable)
{
width = (_screenWidth > unsigned(_cropLeft + _cropRight))
? ((_screenWidth - _cropLeft - _cropRight) / _pixelDecimation)
: _screenWidth / _pixelDecimation;
width = (_width > (_cropLeft + _cropRight))
? ((_width - _cropLeft - _cropRight) / _pixelDecimation)
: _width / _pixelDecimation;
height = (_screenHeight > unsigned(_cropTop + _cropBottom))
? ((_screenHeight - _cropTop - _cropBottom) / _pixelDecimation)
: _screenHeight / _pixelDecimation;
height = (_height > (_cropTop + _cropBottom))
? ((_height - _cropTop - _cropBottom) / _pixelDecimation)
: _height / _pixelDecimation;
Info(_log, "Using XRender for grabbing");
}
else
{
width = (_screenWidth > unsigned(_cropLeft + _cropRight))
? (_screenWidth - _cropLeft - _cropRight)
: _screenWidth;
width = (_width > (_cropLeft + _cropRight))
? (_width - _cropLeft - _cropRight)
: _width;
height = (_screenHeight > unsigned(_cropTop + _cropBottom))
? (_screenHeight - _cropTop - _cropBottom)
: _screenHeight;
height = (_height > (_cropTop + _cropBottom))
? (_height - _cropTop - _cropBottom)
: _height;
Info(_log, "Using XGetImage for grabbing");
}
@@ -267,29 +316,29 @@ int X11Grabber::updateScreenDimensions(bool force)
switch (_videoMode)
{
case VideoMode::VIDEO_3DSBS:
_width = width /2;
_height = height;
_calculatedWidth = width /2;
_calculatedHeight = height;
_src_x = _cropLeft / 2;
_src_y = _cropTop;
break;
case VideoMode::VIDEO_3DTAB:
_width = width;
_height = height / 2;
_calculatedWidth = width;
_calculatedHeight = height / 2;
_src_x = _cropLeft;
_src_y = _cropTop / 2;
break;
case VideoMode::VIDEO_2D:
default:
_width = width;
_height = height;
_calculatedWidth = width;
_calculatedHeight = height;
_src_x = _cropLeft;
_src_y = _cropTop;
break;
}
Info(_log, "Update output image resolution: [%dx%d] to [%dx%d]", _image.width(), _image.height(), _width, _height);
Info(_log, "Update output image resolution: [%dx%d] to [%dx%d]", _image.width(), _image.height(), _calculatedWidth, _calculatedHeight);
_image.resize(_width, _height);
_image.resize(_calculatedWidth, _calculatedHeight);
setupResources();
return 1;
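The video-mode switch above halves one dimension because in side-by-side 3D only half the width carries unique picture content, and in top-and-bottom 3D only half the height does. A self-contained sketch of that mapping (the enum values follow the source; the helper name is illustrative):

```cpp
#include <cassert>
#include <utility>

enum class VideoMode { VIDEO_2D, VIDEO_3DSBS, VIDEO_3DTAB };

// Returns the (calculatedWidth, calculatedHeight) pair for a given 3D mode,
// as done in the switch statement of updateScreenDimensions.
std::pair<int, int> calculatedSize(VideoMode mode, int width, int height)
{
    switch (mode)
    {
    case VideoMode::VIDEO_3DSBS: return { width / 2, height };   // side-by-side
    case VideoMode::VIDEO_3DTAB: return { width, height / 2 };   // top-and-bottom
    default:                     return { width, height };        // plain 2D
    }
}
```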
@@ -298,22 +347,35 @@ int X11Grabber::updateScreenDimensions(bool force)
void X11Grabber::setVideoMode(VideoMode mode)
{
Grabber::setVideoMode(mode);
updateScreenDimensions(true);
}
void X11Grabber::setPixelDecimation(int pixelDecimation)
{
if(_pixelDecimation != pixelDecimation)
if(_x11Display != nullptr)
{
_pixelDecimation = pixelDecimation;
updateScreenDimensions(true);
}
}
void X11Grabber::setCropping(unsigned cropLeft, unsigned cropRight, unsigned cropTop, unsigned cropBottom)
bool X11Grabber::setPixelDecimation(int pixelDecimation)
{
bool rc (true);
if (Grabber::setPixelDecimation(pixelDecimation))
{
if(_x11Display != nullptr)
{
if ( updateScreenDimensions(true) < 0 )
{
rc = false;
}
}
}
return rc;
}
void X11Grabber::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
Grabber::setCropping(cropLeft, cropRight, cropTop, cropBottom);
if(_x11Display != nullptr) updateScreenDimensions(true); // segfault on init
if(_x11Display != nullptr)
{
updateScreenDimensions(true); // segfault on init
}
}
bool X11Grabber::nativeEventFilter(const QByteArray & eventType, void * message, long int * /*result*/)
@@ -332,3 +394,78 @@ bool X11Grabber::nativeEventFilter(const QByteArray & eventType, void * message,
return false;
}
QJsonObject X11Grabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
if ( open() )
{
inputsDiscovered["device"] = "x11";
inputsDiscovered["device_name"] = "X11";
inputsDiscovered["type"] = "screen";
QJsonArray video_inputs;
if (_x11Display != nullptr)
{
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
// Iterate through all X screens
for (int i = 0; i < XScreenCount(_x11Display); ++i)
{
_window = DefaultRootWindow(_x11Display);
const Status status = XGetWindowAttributes(_x11Display, _window, &_windowAttr);
if (status == 0)
{
Debug(_log, "Failed to obtain window attributes");
}
else
{
QJsonObject in;
QString displayName;
char* name;
if ( XFetchName(_x11Display, _window, &name) > 0 )
{
displayName = name;
}
else {
displayName = QString("Display:%1").arg(i);
}
in["name"] = displayName;
in["inputIdx"] = i;
QJsonArray formats;
QJsonArray resolutionArray;
QJsonObject format;
QJsonObject resolution;
resolution["width"] = _windowAttr.width;
resolution["height"] = _windowAttr.height;
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
}
if ( !video_inputs.isEmpty() )
{
inputsDiscovered["video_inputs"] = video_inputs;
}
}
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}

View File

@@ -1,10 +1,14 @@
#include <grabber/X11Wrapper.h>
X11Wrapper::X11Wrapper(int cropLeft, int cropRight, int cropTop, int cropBottom, int pixelDecimation, unsigned updateRate_Hz)
: GrabberWrapper("X11", &_grabber, 0, 0, updateRate_Hz)
, _grabber(cropLeft, cropRight, cropTop, cropBottom, pixelDecimation)
, _init(false)
{}
X11Wrapper::X11Wrapper( int updateRate_Hz,
int pixelDecimation,
int cropLeft, int cropRight, int cropTop, int cropBottom)
: GrabberWrapper("X11", &_grabber, updateRate_Hz)
, _grabber(cropLeft, cropRight, cropTop, cropBottom)
, _init(false)
{
_grabber.setPixelDecimation(pixelDecimation);
}
X11Wrapper::~X11Wrapper()
{
@@ -19,7 +23,7 @@ void X11Wrapper::action()
if (! _init )
{
_init = true;
if ( ! _grabber.Setup() )
if ( ! _grabber.setupDisplay() )
{
stop();
}

View File

@@ -22,7 +22,7 @@ void check_error(xcb_generic_error_t * error)
// Requests with void response type
template<class Request, class ...Args>
typename std::enable_if<std::is_same<typename Request::ResponseType, xcb_void_cookie_t>::value, void>::type
query(xcb_connection_t * connection, Args&& ...args)
static query(xcb_connection_t * connection, Args&& ...args)
{
auto cookie = Request::RequestFunction(connection, std::forward<Args>(args)...);
@@ -33,9 +33,8 @@ template<class Request, class ...Args>
// Requests with non-void response type
template<class Request, class ...Args>
typename std::enable_if<!std::is_same<typename Request::ResponseType, xcb_void_cookie_t>::value,
std::unique_ptr<typename Request::ResponseType, decltype(&free)>>::type
query(xcb_connection_t * connection, Args&& ...args)
typename std::enable_if<!std::is_same<typename Request::ResponseType, xcb_void_cookie_t>::value, std::unique_ptr<typename Request::ResponseType, decltype(&free)>>::type
static query(xcb_connection_t * connection, Args&& ...args)
{
auto cookie = Request::RequestFunction(connection, std::forward<Args>(args)...);

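The two `query` overloads above are selected via `std::enable_if` on `Request::ResponseType`: void-cookie requests get the fire-and-check overload, everything else gets the reply-returning one. The same trait-struct dispatch can be sketched without any xcb dependency (all names here are illustrative):

```cpp
#include <cassert>
#include <type_traits>

// Two mock "request" traits, mirroring the GetGeometry/GetProperty structs:
// the ResponseType member steers overload resolution.
struct VoidReply { using ResponseType = void; static int run() { return 0; } };
struct IntReply  { using ResponseType = int;  static int run() { return 42; } };

// Overload chosen when the request carries no payload (void response).
template <class Request>
typename std::enable_if<std::is_same<typename Request::ResponseType, void>::value, bool>::type
query() { Request::run(); return true; }

// Overload chosen when the request returns a value.
template <class Request>
typename std::enable_if<!std::is_same<typename Request::ResponseType, void>::value, int>::type
query() { return Request::run(); }
```

Because the two `enable_if` conditions are mutually exclusive, exactly one overload survives substitution for any given trait struct.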
View File

@@ -21,6 +21,14 @@ struct GetGeometry
static constexpr auto ReplyFunction = xcb_get_geometry_reply;
};
struct GetProperty
{
typedef xcb_get_property_reply_t ResponseType;
static constexpr auto RequestFunction = xcb_get_property;
static constexpr auto ReplyFunction = xcb_get_property_reply;
};
struct ShmQueryVersion
{
typedef xcb_shm_query_version_reply_t ResponseType;

View File

@@ -14,10 +14,15 @@
#include <memory>
// Constants
namespace {
const bool verbose = false;
} //End of constants
#define DOUBLE_TO_FIXED(d) ((xcb_render_fixed_t) ((d) * 65536))
XcbGrabber::XcbGrabber(int cropLeft, int cropRight, int cropTop, int cropBottom, int pixelDecimation)
: Grabber("XCBGRABBER", 0, 0, cropLeft, cropRight, cropTop, cropBottom)
XcbGrabber::XcbGrabber(int cropLeft, int cropRight, int cropTop, int cropBottom)
: Grabber("XCBGRABBER", cropLeft, cropRight, cropTop, cropBottom)
, _connection{}
, _screen{}
, _pixmap{}
@@ -27,7 +32,6 @@ XcbGrabber::XcbGrabber(int cropLeft, int cropRight, int cropTop, int cropBottom,
, _dstPicture{}
, _transform{}
, _shminfo{}
, _pixelDecimation(pixelDecimation)
, _screenWidth{}
, _screenHeight{}
, _src_x(cropLeft)
@@ -36,6 +40,7 @@ XcbGrabber::XcbGrabber(int cropLeft, int cropRight, int cropTop, int cropBottom,
, _XcbRandRAvailable{}
, _XcbShmAvailable{}
, _XcbShmPixmapAvailable{}
, _isWayland (false)
, _logger{}
, _shmData{}
, _XcbRandREventBase{-1}
@@ -181,54 +186,83 @@ void XcbGrabber::setupShm()
}
}
bool XcbGrabber::Setup()
bool XcbGrabber::open()
{
int screen_num;
_connection = xcb_connect(nullptr, &screen_num);
bool rc = false;
int ret = xcb_connection_has_error(_connection);
if (ret != 0)
if (getenv("WAYLAND_DISPLAY") != nullptr)
{
Error(_logger, "Cannot open display, error %d", ret);
return false;
_isWayland = true;
}
const xcb_setup_t * setup = xcb_get_setup(_connection);
_screen = getScreen(setup, screen_num);
if (!_screen)
else
{
Error(_log, "Unable to open display, screen %d does not exist", screen_num);
_connection = xcb_connect(nullptr, &_screen_num);
if (getenv("DISPLAY"))
Error(_log, "%s", getenv("DISPLAY"));
int ret = xcb_connection_has_error(_connection);
if (ret != 0)
{
Debug(_logger, "Cannot open display, error %d", ret);
}
else
Error(_log, "DISPLAY environment variable not set");
freeResources();
return false;
{
const xcb_setup_t * setup = xcb_get_setup(_connection);
_screen = getScreen(setup, _screen_num);
if ( _screen != nullptr)
{
rc = true;
}
}
}
setupRandr();
setupRender();
setupShm();
return rc;
}
Info(_log, QString("XcbRandR=[%1] XcbRender=[%2] XcbShm=[%3] XcbPixmap=[%4]")
.arg(_XcbRandRAvailable ? "available" : "unavailable")
.arg(_XcbRenderAvailable ? "available" : "unavailable")
.arg(_XcbShmAvailable ? "available" : "unavailable")
.arg(_XcbShmPixmapAvailable ? "available" : "unavailable")
.toStdString().c_str());
bool XcbGrabber::setupDisplay()
{
bool result = false;
bool result = (updateScreenDimensions(true) >= 0);
ErrorIf(!result, _log, "XCB Grabber start failed");
setEnabled(result);
if ( ! open() )
{
if ( _isWayland )
{
Error(_log, "Grabber does not work under Wayland!");
}
else
{
if (getenv("DISPLAY") != nullptr)
{
Error(_log, "Unable to open display [%s], screen %d does not exist", getenv("DISPLAY"), _screen_num);
}
else
{
Error(_log, "DISPLAY environment variable not set");
}
freeResources();
}
}
else
{
setupRandr();
setupRender();
setupShm();
Info(_log, QString("XcbRandR=[%1] XcbRender=[%2] XcbShm=[%3] XcbPixmap=[%4]")
.arg(_XcbRandRAvailable ? "available" : "unavailable")
.arg(_XcbRenderAvailable ? "available" : "unavailable")
.arg(_XcbShmAvailable ? "available" : "unavailable")
.arg(_XcbShmPixmapAvailable ? "available" : "unavailable")
.toStdString().c_str());
result = (updateScreenDimensions(true) >= 0);
ErrorIf(!result, _log, "XCB Grabber start failed");
setEnabled(result);
}
return result;
}
int XcbGrabber::grabFrame(Image<ColorRgb> & image, bool forceUpdate)
{
if (!_enabled)
if (!_isEnabled)
return 0;
if (forceUpdate)
@@ -316,7 +350,7 @@ int XcbGrabber::updateScreenDimensions(bool force)
return -1;
}
if (!_enabled)
if (!_isEnabled)
setEnabled(true);
if (!force && _screenWidth == unsigned(geometry->width) &&
@@ -391,19 +425,29 @@ int XcbGrabber::updateScreenDimensions(bool force)
void XcbGrabber::setVideoMode(VideoMode mode)
{
Grabber::setVideoMode(mode);
updateScreenDimensions(true);
}
void XcbGrabber::setPixelDecimation(int pixelDecimation)
{
if(_pixelDecimation != pixelDecimation)
if(_connection != nullptr)
{
_pixelDecimation = pixelDecimation;
updateScreenDimensions(true);
}
}
void XcbGrabber::setCropping(unsigned cropLeft, unsigned cropRight, unsigned cropTop, unsigned cropBottom)
bool XcbGrabber::setPixelDecimation(int pixelDecimation)
{
bool rc (true);
if (Grabber::setPixelDecimation(pixelDecimation))
{
if(_connection != nullptr)
{
if ( updateScreenDimensions(true) < 0 )
{
rc = false;
}
}
}
return rc;
}
void XcbGrabber::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
Grabber::setCropping(cropLeft, cropRight, cropTop, cropBottom);
if(_connection != nullptr)
@@ -459,3 +503,89 @@ xcb_render_pictformat_t XcbGrabber::findFormatForVisual(xcb_visualid_t visual) c
}
return {};
}
QJsonObject XcbGrabber::discover(const QJsonObject& params)
{
DebugIf(verbose, _log, "params: [%s]", QString(QJsonDocument(params).toJson(QJsonDocument::Compact)).toUtf8().constData());
QJsonObject inputsDiscovered;
if ( open() )
{
inputsDiscovered["device"] = "xcb";
inputsDiscovered["device_name"] = "XCB";
inputsDiscovered["type"] = "screen";
QJsonArray video_inputs;
if (_connection != nullptr && _screen != nullptr )
{
QJsonArray fps = { 1, 5, 10, 15, 20, 25, 30, 40, 50, 60 };
const xcb_setup_t * setup = xcb_get_setup(_connection);
xcb_screen_iterator_t it = xcb_setup_roots_iterator(setup);
xcb_screen_t * screen = nullptr;
int i = 0;
// Iterate through all X screens
for (; it.rem > 0; xcb_screen_next(&it))
{
screen = it.data;
auto geometry = query<GetGeometry>(_connection, screen->root);
if (geometry == nullptr)
{
Debug(_log, "Failed to obtain screen geometry for screen [%d]", i);
}
else
{
QJsonObject in;
QString displayName;
auto property = query<GetProperty>(_connection, 0, screen->root, XCB_ATOM_WM_NAME, XCB_ATOM_STRING, 0, 0);
if ( property != nullptr )
{
if ( xcb_get_property_value_length(property.get()) > 0 )
{
displayName = (char *) xcb_get_property_value(property.get());
}
}
if (displayName.isEmpty())
{
displayName = QString("Display:%1").arg(i);
}
in["name"] = displayName;
in["inputIdx"] = i;
QJsonArray formats;
QJsonArray resolutionArray;
QJsonObject format;
QJsonObject resolution;
resolution["width"] = geometry->width;
resolution["height"] = geometry->height;
resolution["fps"] = fps;
resolutionArray.append(resolution);
format["resolutions"] = resolutionArray;
formats.append(format);
in["formats"] = formats;
video_inputs.append(in);
}
++i;
}
if ( !video_inputs.isEmpty() )
{
inputsDiscovered["video_inputs"] = video_inputs;
}
}
}
DebugIf(verbose, _log, "device: [%s]", QString(QJsonDocument(inputsDiscovered).toJson(QJsonDocument::Compact)).toUtf8().constData());
return inputsDiscovered;
}

View File

@@ -1,10 +1,14 @@
#include <grabber/XcbWrapper.h>
XcbWrapper::XcbWrapper(int cropLeft, int cropRight, int cropTop, int cropBottom, int pixelDecimation, const unsigned updateRate_Hz)
: GrabberWrapper("Xcb", &_grabber, 0, 0, updateRate_Hz)
, _grabber(cropLeft, cropRight, cropTop, cropBottom, pixelDecimation)
XcbWrapper::XcbWrapper( int updateRate_Hz,
int pixelDecimation,
int cropLeft, int cropRight, int cropTop, int cropBottom)
: GrabberWrapper("Xcb", &_grabber, updateRate_Hz)
, _grabber(cropLeft, cropRight, cropTop, cropBottom)
, _init(false)
{}
{
_grabber.setPixelDecimation(pixelDecimation);
}
XcbWrapper::~XcbWrapper()
{
@@ -19,7 +23,7 @@ void XcbWrapper::action()
if (! _init )
{
_init = true;
if ( ! _grabber.Setup() )
if ( ! _grabber.setupDisplay() )
{
stop();
}

View File

@@ -47,6 +47,7 @@ void CaptureCont::handleV4lImage(const QString& name, const Image<ColorRgb> & im
{
_hyperion->registerInput(_v4lCaptPrio, hyperion::COMP_V4L, "System", name);
_v4lCaptName = name;
emit GlobalSignals::getInstance()->requestSource(hyperion::COMP_V4L, int(_hyperion->getInstanceIndex()), _v4lCaptEnabled);
}
_v4lInactiveTimer->start();
_hyperion->setInputImage(_v4lCaptPrio, image);
@@ -58,6 +59,7 @@ void CaptureCont::handleSystemImage(const QString& name, const Image<ColorRgb>&
{
_hyperion->registerInput(_systemCaptPrio, hyperion::COMP_GRABBER, "System", name);
_systemCaptName = name;
emit GlobalSignals::getInstance()->requestSource(hyperion::COMP_GRABBER, int(_hyperion->getInstanceIndex()), _systemCaptEnabled);
}
_systemInactiveTimer->start();
_hyperion->setInputImage(_systemCaptPrio, image);
@@ -75,7 +77,7 @@ void CaptureCont::setSystemCaptureEnable(bool enable)
}
else
{
disconnect(GlobalSignals::getInstance(), &GlobalSignals::setSystemImage, 0, 0);
disconnect(GlobalSignals::getInstance(), &GlobalSignals::setSystemImage, this, 0);
_hyperion->clear(_systemCaptPrio);
_systemInactiveTimer->stop();
_systemCaptName = "";
@@ -98,7 +100,7 @@ void CaptureCont::setV4LCaptureEnable(bool enable)
}
else
{
disconnect(GlobalSignals::getInstance(), &GlobalSignals::setV4lImage, 0, 0);
disconnect(GlobalSignals::getInstance(), &GlobalSignals::setV4lImage, this, 0);
_hyperion->clear(_v4lCaptPrio);
_v4lInactiveTimer->stop();
_v4lCaptName = "";
@@ -125,8 +127,8 @@ void CaptureCont::handleSettingsUpdate(settings::type type, const QJsonDocument&
_systemCaptPrio = obj["systemPriority"].toInt(250);
}
setV4LCaptureEnable(obj["v4lEnable"].toBool(true));
setSystemCaptureEnable(obj["systemEnable"].toBool(true));
setV4LCaptureEnable(obj["v4lEnable"].toBool(false));
setSystemCaptureEnable(obj["systemEnable"].toBool(false));
}
}

View File

@@ -1,33 +1,46 @@
#include <hyperion/Grabber.h>
#include <hyperion/GrabberWrapper.h>
Grabber::Grabber(const QString& grabberName, int width, int height, int cropLeft, int cropRight, int cropTop, int cropBottom)
: _imageResampler()
Grabber::Grabber(const QString& grabberName, int cropLeft, int cropRight, int cropTop, int cropBottom)
: _grabberName(grabberName)
, _log(Logger::getInstance(_grabberName.toUpper()))
, _useImageResampler(true)
, _videoMode(VideoMode::VIDEO_2D)
, _width(width)
, _height(height)
, _fps(15)
, _videoStandard(VideoStandard::NO_CHANGE)
, _pixelDecimation(GrabberWrapper::DEFAULT_PIXELDECIMATION)
, _flipMode(FlipMode::NO_CHANGE)
, _width(0)
, _height(0)
, _fps(GrabberWrapper::DEFAULT_RATE_HZ)
, _fpsSoftwareDecimation(0)
, _input(-1)
, _cropLeft(0)
, _cropRight(0)
, _cropTop(0)
, _cropBottom(0)
, _enabled(true)
, _log(Logger::getInstance(grabberName.toUpper()))
, _isEnabled(true)
, _isDeviceInError(false)
{
Grabber::setVideoMode(VideoMode::VIDEO_2D);
Grabber::setCropping(cropLeft, cropRight, cropTop, cropBottom);
}
void Grabber::setEnabled(bool enable)
{
Info(_log,"Capture interface is now %s", enable ? "enabled" : "disabled");
_enabled = enable;
_isEnabled = enable;
}
void Grabber::setInError(const QString& errorMsg)
{
_isDeviceInError = true;
_isEnabled = false;
Error(_log, "Grabber disabled, device '%s' signals error: '%s'", QSTRING_CSTR(_grabberName), QSTRING_CSTR(errorMsg));
}
void Grabber::setVideoMode(VideoMode mode)
{
Debug(_log,"Set videomode to %d", mode);
Info(_log,"Set videomode to %s", QSTRING_CSTR(videoMode2String(mode)));
_videoMode = mode;
if ( _useImageResampler )
{
@@ -35,11 +48,46 @@ void Grabber::setVideoMode(VideoMode mode)
}
}
void Grabber::setCropping(unsigned cropLeft, unsigned cropRight, unsigned cropTop, unsigned cropBottom)
void Grabber::setVideoStandard(VideoStandard videoStandard)
{
if (_videoStandard != videoStandard) {
_videoStandard = videoStandard;
}
}
bool Grabber::setPixelDecimation(int pixelDecimation)
{
if (_pixelDecimation != pixelDecimation)
{
Info(_log,"Set image size decimation to %d", pixelDecimation);
_pixelDecimation = pixelDecimation;
if ( _useImageResampler )
{
_imageResampler.setHorizontalPixelDecimation(pixelDecimation);
_imageResampler.setVerticalPixelDecimation(pixelDecimation);
}
return true;
}
return false;
}
void Grabber::setFlipMode(FlipMode mode)
{
Info(_log,"Set flipmode to %s", QSTRING_CSTR(flipModeToString(mode)));
_flipMode = mode;
if ( _useImageResampler )
{
_imageResampler.setFlipMode(_flipMode);
}
}
void Grabber::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
if (_width>0 && _height>0)
{
if (cropLeft + cropRight >= (unsigned)_width || cropTop + cropBottom >= (unsigned)_height)
if (cropLeft + cropRight >= _width || cropTop + cropBottom >= _height)
{
Error(_log, "Rejecting invalid crop values: left: %d, right: %d, top: %d, bottom: %d, higher than height/width %d/%d", cropLeft, cropRight, cropTop, cropBottom, _height, _width);
return;
@@ -79,29 +127,45 @@ bool Grabber::setInput(int input)
bool Grabber::setWidthHeight(int width, int height)
{
bool rc (false);
// eval changes with crop
if ( (width>0 && height>0) && (_width != width || _height != height) )
{
if (_cropLeft + _cropRight >= width || _cropTop + _cropBottom >= height)
{
Error(_log, "Rejecting invalid width/height values as they collide with image cropping: width: %d, height: %d", width, height);
return false;
rc = false;
}
else
{
Debug(_log, "Set new width: %d, height: %d for capture", width, height);
_width = width;
_height = height;
rc = true;
}
Debug(_log, "Set new width: %d, height: %d for capture", width, height);
_width = width;
_height = height;
return true;
}
return false;
return rc;
}
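Both `setCropping` and `setWidthHeight` above enforce the same invariant: the configured crop must leave at least one visible pixel in each dimension. A one-line sketch of that validation (hypothetical helper name):

```cpp
#include <cassert>

// True when the crop borders still leave a non-empty capture area,
// matching the rejection checks in Grabber::setCropping / setWidthHeight.
bool cropFits(int width, int height,
              int cropLeft, int cropRight, int cropTop, int cropBottom)
{
    return (cropLeft + cropRight < width) && (cropTop + cropBottom < height);
}
```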
bool Grabber::setFramerate(int fps)
{
if((fps > 0) && (_fps != fps))
{
Info(_log,"Set new frames per second to: %i fps", fps);
_fps = fps;
return true;
}
return false;
}
void Grabber::setFpsSoftwareDecimation(int decimation)
{
if((_fpsSoftwareDecimation != decimation))
{
_fpsSoftwareDecimation = decimation;
if(decimation > 0){
Debug(_log,"Skip %i frames per second", decimation);
}
}
}

View File

@@ -10,22 +10,31 @@
#include <QTimer>
GrabberWrapper* GrabberWrapper::instance = nullptr;
const int GrabberWrapper::DEFAULT_RATE_HZ = 10;
const int GrabberWrapper::DEFAULT_MIN_GRAB_RATE_HZ = 1;
const int GrabberWrapper::DEFAULT_MAX_GRAB_RATE_HZ = 30;
const int GrabberWrapper::DEFAULT_PIXELDECIMATION = 8;
GrabberWrapper::GrabberWrapper(const QString& grabberName, Grabber * ggrabber, unsigned width, unsigned height, unsigned updateRate_Hz)
/// Map of Hyperion instances with grabber name that requested screen capture
QMap<int, QString> GrabberWrapper::GRABBER_SYS_CLIENTS = QMap<int, QString>();
QMap<int, QString> GrabberWrapper::GRABBER_V4L_CLIENTS = QMap<int, QString>();
bool GrabberWrapper::GLOBAL_GRABBER_SYS_ENABLE = false;
bool GrabberWrapper::GLOBAL_GRABBER_V4L_ENABLE = false;
GrabberWrapper::GrabberWrapper(const QString& grabberName, Grabber * ggrabber, int updateRate_Hz)
: _grabberName(grabberName)
, _timer(new QTimer(this))
, _updateInterval_ms(1000/updateRate_Hz)
, _log(Logger::getInstance(grabberName))
, _ggrabber(ggrabber)
, _image(0,0)
, _log(Logger::getInstance(grabberName.toUpper()))
, _timer(new QTimer(this))
, _updateInterval_ms(1000/updateRate_Hz)
, _ggrabber(ggrabber)
, _image(0,0)
{
GrabberWrapper::instance = this;
// Configure the timer to generate events every n milliseconds
_timer->setTimerType(Qt::PreciseTimer);
_timer->setInterval(_updateInterval_ms);
_image.resize(width, height);
connect(_timer, &QTimer::timeout, this, &GrabberWrapper::action);
// connect the image forwarding
@@ -44,17 +53,26 @@ GrabberWrapper::~GrabberWrapper()
bool GrabberWrapper::start()
{
// Start the timer with the pre configured interval
Debug(_log,"Grabber start()");
_timer->start();
return _timer->isActive();
bool rc = false;
if ( open() )
{
if (!_timer->isActive())
{
// Start the timer with the pre configured interval
Debug(_log,"Grabber start()");
_timer->start();
}
rc = _timer->isActive();
}
return rc;
}
void GrabberWrapper::stop()
{
if (_timer->isActive())
{
// Stop the timer, effectivly stopping the process
// Stop the timer, effectively stopping the process
Debug(_log,"Grabber stop()");
_timer->stop();
}
@@ -65,50 +83,58 @@ bool GrabberWrapper::isActive() const
return _timer->isActive();
}
QString GrabberWrapper::getActive() const
QStringList GrabberWrapper::getActive(int inst) const
{
return _grabberName;
QStringList result = QStringList();
if(GRABBER_V4L_CLIENTS.contains(inst))
result << GRABBER_V4L_CLIENTS.value(inst);
if(GRABBER_SYS_CLIENTS.contains(inst))
result << GRABBER_SYS_CLIENTS.value(inst);
return result;
}
QStringList GrabberWrapper::availableGrabbers()
{
QStringList grabbers;
#ifdef ENABLE_DISPMANX
#ifdef ENABLE_DISPMANX
grabbers << "dispmanx";
#endif
#endif
#ifdef ENABLE_V4L2
#if defined(ENABLE_V4L2) || defined(ENABLE_MF)
grabbers << "v4l2";
#endif
#endif
#ifdef ENABLE_FB
#ifdef ENABLE_FB
grabbers << "framebuffer";
#endif
#endif
#ifdef ENABLE_AMLOGIC
#ifdef ENABLE_AMLOGIC
grabbers << "amlogic";
#endif
#endif
#ifdef ENABLE_OSX
#ifdef ENABLE_OSX
grabbers << "osx";
#endif
#endif
#ifdef ENABLE_X11
#ifdef ENABLE_X11
grabbers << "x11";
#endif
#endif
#ifdef ENABLE_XCB
#ifdef ENABLE_XCB
grabbers << "xcb";
#endif
#endif
#ifdef ENABLE_QT
#ifdef ENABLE_QT
grabbers << "qt";
#endif
#endif
#ifdef ENABLE_DX
grabbers << "dx";
#endif
#ifdef ENABLE_DX
grabbers << "dx";
#endif
return grabbers;
}
@@ -117,12 +143,17 @@ void GrabberWrapper::setVideoMode(VideoMode mode)
{
if (_ggrabber != nullptr)
{
Info(_log,"setvideomode");
Info(_log,"setVideoMode");
_ggrabber->setVideoMode(mode);
}
}
void GrabberWrapper::setCropping(unsigned cropLeft, unsigned cropRight, unsigned cropTop, unsigned cropBottom)
void GrabberWrapper::setFlipMode(const QString& flipMode)
{
_ggrabber->setFlipMode(parseFlipMode(flipMode));
}
void GrabberWrapper::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
_ggrabber->setCropping(cropLeft, cropRight, cropTop, cropBottom);
}
@@ -143,33 +174,40 @@ void GrabberWrapper::updateTimer(int interval)
}
void GrabberWrapper::handleSettingsUpdate(settings::type type, const QJsonDocument& config)
{
if(type == settings::SYSTEMCAPTURE && !_grabberName.startsWith("V4L"))
{
if(type == settings::SYSTEMCAPTURE && !_grabberName.startsWith("V4L"))
{
// extract settings
const QJsonObject& obj = config.object();
// width/height
_ggrabber->setWidthHeight(obj["width"].toInt(96), obj["height"].toInt(96));
// set global grabber state
setSysGrabberState(obj["enable"].toBool(false));
// display index for MAC
_ggrabber->setDisplayIndex(obj["display"].toInt(0));
if (getSysGrabberState())
{
// width/height
_ggrabber->setWidthHeight(obj["width"].toInt(96), obj["height"].toInt(96));
// device path for Framebuffer
_ggrabber->setDevicePath(obj["device"].toString("/dev/fb0"));
// display index for MAC
_ggrabber->setDisplayIndex(obj["input"].toInt(0));
// pixel decimation for x11
_ggrabber->setPixelDecimation(obj["pixelDecimation"].toInt(8));
// pixel decimation for x11
_ggrabber->setPixelDecimation(obj["pixelDecimation"].toInt(DEFAULT_PIXELDECIMATION));
// crop for system capture
_ggrabber->setCropping(
obj["cropLeft"].toInt(0),
obj["cropRight"].toInt(0),
obj["cropTop"].toInt(0),
obj["cropBottom"].toInt(0));
// crop for system capture
_ggrabber->setCropping(
obj["cropLeft"].toInt(0),
obj["cropRight"].toInt(0),
obj["cropTop"].toInt(0),
obj["cropBottom"].toInt(0));
// eval new update time
updateTimer(1000/obj["frequency_Hz"].toInt(10));
_ggrabber->setFramerate(obj["fps"].toInt(DEFAULT_RATE_HZ));
// eval new update time
updateTimer(_ggrabber->getUpdateInterval());
}
else
{
stop();
}
}
}
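The settings handler above converts the configured capture rate into a timer interval via `1000 / frequency_Hz` (and, after the refactor, via the grabber's `getUpdateInterval()`). The conversion itself is trivial but worth pinning down, since integer division truncates; a minimal sketch:

```cpp
#include <cassert>

// Capture rate in Hz -> QTimer interval in milliseconds, as used by
// GrabberWrapper::updateTimer. Integer division truncates, so e.g.
// 30 Hz yields a 33 ms interval (slightly above 30 fps).
int updateIntervalMs(int fps)
{
    return 1000 / fps;
}
```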
@@ -177,24 +215,24 @@ void GrabberWrapper::handleSourceRequest(hyperion::Components component, int hyp
{
if(component == hyperion::Components::COMP_GRABBER && !_grabberName.startsWith("V4L"))
{
if(listen && !GRABBER_SYS_CLIENTS.contains(hyperionInd))
GRABBER_SYS_CLIENTS.append(hyperionInd);
else if (!listen)
GRABBER_SYS_CLIENTS.removeOne(hyperionInd);
if(listen)
GRABBER_SYS_CLIENTS.insert(hyperionInd, _grabberName);
else
GRABBER_SYS_CLIENTS.remove(hyperionInd);
if(GRABBER_SYS_CLIENTS.empty())
if(GRABBER_SYS_CLIENTS.empty() || !getSysGrabberState())
stop();
else
start();
}
else if(component == hyperion::Components::COMP_V4L && _grabberName.startsWith("V4L"))
{
if(listen && !GRABBER_V4L_CLIENTS.contains(hyperionInd))
GRABBER_V4L_CLIENTS.append(hyperionInd);
else if (!listen)
GRABBER_V4L_CLIENTS.removeOne(hyperionInd);
if(listen)
GRABBER_V4L_CLIENTS.insert(hyperionInd, _grabberName);
else
GRABBER_V4L_CLIENTS.remove(hyperionInd);
if(GRABBER_V4L_CLIENTS.empty())
if(GRABBER_V4L_CLIENTS.empty() || !getV4lGrabberState())
stop();
else
start();
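The hunk above replaces the old list-based client tracking with a `QMap<int, QString>` keyed by Hyperion instance, so a repeated `listen` request simply overwrites the entry instead of needing a contains-check. The pattern can be sketched with `std::map` standing in for `QMap` (a simplified model; the global-enable flags are omitted):

```cpp
#include <cassert>
#include <map>
#include <string>

// Instance index -> name of the grabber that instance requested.
std::map<int, std::string> clients;

// Register or deregister a client; returns whether grabbing should
// keep running (true while at least one client is still listening).
bool handleRequest(int instance, const std::string& grabber, bool listen)
{
    if (listen)
        clients[instance] = grabber;   // insert or overwrite
    else
        clients.erase(instance);
    return !clients.empty();
}
```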
@@ -204,48 +242,6 @@ void GrabberWrapper::handleSourceRequest(hyperion::Components component, int hyp
void GrabberWrapper::tryStart()
{
// verify start condition
if((_grabberName.startsWith("V4L") && !GRABBER_V4L_CLIENTS.empty()) || (!_grabberName.startsWith("V4L") && !GRABBER_SYS_CLIENTS.empty()))
{
if(!_grabberName.startsWith("V4L") && !GRABBER_SYS_CLIENTS.empty() && getSysGrabberState())
start();
}
}
QStringList GrabberWrapper::getV4L2devices() const
{
if(_grabberName.startsWith("V4L"))
return _ggrabber->getV4L2devices();
return QStringList();
}
QString GrabberWrapper::getV4L2deviceName(const QString& devicePath) const
{
if(_grabberName.startsWith("V4L"))
return _ggrabber->getV4L2deviceName(devicePath);
return QString();
}
QMultiMap<QString, int> GrabberWrapper::getV4L2deviceInputs(const QString& devicePath) const
{
if(_grabberName.startsWith("V4L"))
return _ggrabber->getV4L2deviceInputs(devicePath);
return QMultiMap<QString, int>();
}
QStringList GrabberWrapper::getResolutions(const QString& devicePath) const
{
if(_grabberName.startsWith("V4L"))
return _ggrabber->getResolutions(devicePath);
return QStringList();
}
QStringList GrabberWrapper::getFramerates(const QString& devicePath) const
{
if(_grabberName.startsWith("V4L"))
return _ggrabber->getFramerates(devicePath);
return QStringList();
}

View File

@@ -56,7 +56,7 @@ Hyperion::Hyperion(quint8 instance, bool readonlyMode)
, _hwLedCount()
, _ledGridSize(hyperion::getLedLayoutGridSize(getSetting(settings::LEDS).array()))
, _BGEffectHandler(nullptr)
,_captureCont(nullptr)
, _captureCont(nullptr)
, _ledBuffer(_ledString.leds().size(), ColorRgb::BLACK)
, _boblightServer(nullptr)
, _readOnlyMode(readonlyMode)

View File

@@ -88,7 +88,6 @@ bool HyperionIManager::startInstance(quint8 inst, bool block, QObject* caller, i
// from Hyperion
connect(hyperion, &Hyperion::settingsChanged, this, &HyperionIManager::settingsChanged);
connect(hyperion, &Hyperion::videoMode, this, &HyperionIManager::requestVideoMode);
connect(hyperion, &Hyperion::compStateChangeRequest, this, &HyperionIManager::compStateChangeRequest);
// to Hyperion
connect(this, &HyperionIManager::newVideoMode, hyperion, &Hyperion::newVideoMode);

View File

@@ -14,14 +14,17 @@
const int PriorityMuxer::FG_PRIORITY = 1;
const int PriorityMuxer::BG_PRIORITY = 254;
const int PriorityMuxer::MANUAL_SELECTED_PRIORITY = 256;
const int PriorityMuxer::LOWEST_PRIORITY = std::numeric_limits<uint8_t>::max();
const int PriorityMuxer::TIMEOUT_NOT_ACTIVE_PRIO = -100;
PriorityMuxer::PriorityMuxer(int ledCount, QObject * parent)
: QObject(parent)
, _log(Logger::getInstance("HYPERION"))
, _currentPriority(PriorityMuxer::LOWEST_PRIORITY)
, _previousPriority(_currentPriority)
, _manualSelectedPriority(256)
, _manualSelectedPriority(MANUAL_SELECTED_PRIORITY)
, _prevVisComp (hyperion::Components::COMP_COLOR)
, _activeInputs()
, _lowestPriorityInfo()
, _sourceAutoSelectEnabled(true)
@@ -101,7 +104,7 @@ void PriorityMuxer::updateLedColorsLength(int ledCount)
{
for (auto infoIt = _activeInputs.begin(); infoIt != _activeInputs.end();)
{
if (infoIt->ledColors.size() >= 1)
if (!infoIt->ledColors.empty())
{
infoIt->ledColors.resize(ledCount, infoIt->ledColors.at(0));
}
@@ -151,7 +154,7 @@ void PriorityMuxer::registerInput(int priority, hyperion::Components component,
InputInfo& input = _activeInputs[priority];
input.priority = priority;
input.timeoutTime_ms = newInput ? -100 : input.timeoutTime_ms;
input.timeoutTime_ms = newInput ? TIMEOUT_NOT_ACTIVE_PRIO : input.timeoutTime_ms;
input.componentId = component;
input.origin = origin;
input.smooth_cfg = smooth_cfg;
@@ -162,7 +165,9 @@ void PriorityMuxer::registerInput(int priority, hyperion::Components component,
Debug(_log,"Register new input '%s/%s' with priority %d as inactive", QSTRING_CSTR(origin), hyperion::componentToIdString(component), priority);
// emit 'prioritiesChanged' only if _sourceAutoSelectEnabled is false
if (!_sourceAutoSelectEnabled)
{
emit prioritiesChanged();
}
return;
}
@@ -180,19 +185,26 @@ bool PriorityMuxer::setInput(int priority, const std::vector<ColorRgb>& ledColor
return false;
}
// calc final timeout
if(timeout_ms > 0)
timeout_ms = QDateTime::currentMSecsSinceEpoch() + timeout_ms;
InputInfo& input = _activeInputs[priority];
InputInfo& input = _activeInputs[priority];
// detect active <-> inactive changes
bool activeChange = false;
bool active = true;
if(input.timeoutTime_ms == -100 && timeout_ms != -100)
// calculate final timeout
if (timeout_ms >= 0)
{
timeout_ms = QDateTime::currentMSecsSinceEpoch() + timeout_ms;
}
else if (input.timeoutTime_ms >= 0)
{
timeout_ms = QDateTime::currentMSecsSinceEpoch();
}
if(input.timeoutTime_ms == TIMEOUT_NOT_ACTIVE_PRIO && timeout_ms != TIMEOUT_NOT_ACTIVE_PRIO)
{
activeChange = true;
}
else if(timeout_ms == -100 && input.timeoutTime_ms != -100)
else if(timeout_ms == TIMEOUT_NOT_ACTIVE_PRIO && input.timeoutTime_ms != TIMEOUT_NOT_ACTIVE_PRIO)
{
active = false;
activeChange = true;
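The timeout handling above folds three cases into one rule: a non-negative request is relative and becomes an absolute expiry, a negative request against a currently timed input expires it immediately, and otherwise the sentinel (`TIMEOUT_NOT_ACTIVE_PRIO`, -100) is kept to mark the input inactive. A standalone sketch, with `now` standing in for `QDateTime::currentMSecsSinceEpoch()`:

```cpp
#include <cassert>

const long TIMEOUT_NOT_ACTIVE_PRIO = -100;

// Resolve the timeout passed to setInput/setInputImage into an absolute
// expiry timestamp, mirroring the branches in PriorityMuxer.
long resolveTimeout(long requested_ms, long currentExpiry_ms, long now)
{
    if (requested_ms >= 0)
        return now + requested_ms;   // relative timeout -> absolute expiry
    if (currentExpiry_ms >= 0)
        return now;                  // input had an expiry: let it lapse now
    return requested_ms;             // keep sentinel value (inactive input)
}
```

The active/inactive transition is then detected by comparing old and new values against the `TIMEOUT_NOT_ACTIVE_PRIO` sentinel, as the surrounding diff shows.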
@@ -224,19 +236,26 @@ bool PriorityMuxer::setInputImage(int priority, const Image<ColorRgb>& image, in
return false;
}
// calculate final timeout
if(timeout_ms > 0)
timeout_ms = QDateTime::currentMSecsSinceEpoch() + timeout_ms;
InputInfo& input = _activeInputs[priority];
InputInfo& input = _activeInputs[priority];
// detect active <-> inactive changes
bool activeChange = false;
bool active = true;
if(input.timeoutTime_ms == -100 && timeout_ms != -100)
// calculate final timeout
if (timeout_ms >= 0)
{
timeout_ms = QDateTime::currentMSecsSinceEpoch() + timeout_ms;
}
else if (input.timeoutTime_ms >= 0)
{
timeout_ms = QDateTime::currentMSecsSinceEpoch();
}
if(input.timeoutTime_ms == TIMEOUT_NOT_ACTIVE_PRIO && timeout_ms != TIMEOUT_NOT_ACTIVE_PRIO)
{
activeChange = true;
}
else if(timeout_ms == -100 && input.timeoutTime_ms != -100)
else if(timeout_ms == TIMEOUT_NOT_ACTIVE_PRIO && input.timeoutTime_ms != TIMEOUT_NOT_ACTIVE_PRIO)
{
active = false;
activeChange = true;
@@ -251,7 +270,9 @@ bool PriorityMuxer::setInputImage(int priority, const Image<ColorRgb>& image, in
{
Debug(_log, "Priority %d is now %s", priority, active ? "active" : "inactive");
if (_currentPriority < priority)
{
emit prioritiesChanged();
}
setCurrentTime();
}
@@ -261,12 +282,12 @@ bool PriorityMuxer::setInputImage(int priority, const Image<ColorRgb>& image, in
bool PriorityMuxer::setInputInactive(int priority)
{
Image<ColorRgb> image;
return setInputImage(priority, image, -100);
return setInputImage(priority, image, TIMEOUT_NOT_ACTIVE_PRIO);
}
bool PriorityMuxer::clearInput(int priority)
{
if (priority < PriorityMuxer::LOWEST_PRIORITY && (_activeInputs.remove(priority) > 0))
{
Debug(_log,"Removed source priority %d",priority);
// on clear success update _currentPriority
@@ -318,14 +339,15 @@ void PriorityMuxer::setCurrentTime()
}
else
{
// timeoutTime of TIMEOUT_NOT_ACTIVE_PRIO is awaiting data (inactive); skip
if(infoIt->timeoutTime_ms > TIMEOUT_NOT_ACTIVE_PRIO)
newPriority = qMin(newPriority, infoIt->priority);
// call timeTrigger when effect or color is running with timeout > 0, blacklist prio 255
if (infoIt->priority < BG_PRIORITY && infoIt->timeoutTime_ms > 0 && (infoIt->componentId == hyperion::COMP_EFFECT || infoIt->componentId == hyperion::COMP_COLOR || infoIt->componentId == hyperion::COMP_IMAGE))
{
emit signalTimeTrigger(); // as signal to prevent Threading issues
}
++infoIt;
}
}
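The hunks above replace the bare `-100` sentinel with the named constant `TIMEOUT_NOT_ACTIVE_PRIO` when detecting active/inactive priority transitions. A minimal, self-contained sketch of that sentinel comparison (the `Transition` struct and `detectTransition` helper are illustrative, not part of the actual PriorityMuxer API):

```cpp
#include <cassert>
#include <cstdint>

// Mirrors PriorityMuxer's sentinel: a timeout of -100 marks an inactive priority.
constexpr int64_t TIMEOUT_NOT_ACTIVE_PRIO = -100;

struct Transition { bool active; bool activeChange; };

// Compare the stored timeout against the newly requested one, as the muxer does:
// sentinel -> value means a priority became active, value -> sentinel means inactive.
Transition detectTransition(int64_t storedTimeout, int64_t newTimeout)
{
	Transition t{ true, false };
	if (storedTimeout == TIMEOUT_NOT_ACTIVE_PRIO && newTimeout != TIMEOUT_NOT_ACTIVE_PRIO)
	{
		t.activeChange = true;  // inactive -> active
	}
	else if (newTimeout == TIMEOUT_NOT_ACTIVE_PRIO && storedTimeout != TIMEOUT_NOT_ACTIVE_PRIO)
	{
		t.active = false;       // active -> inactive
		t.activeChange = true;
	}
	return t;
}
```

Naming the sentinel makes both directions of the comparison self-documenting, which is the point of the rename in this diff.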


@@ -391,12 +391,67 @@ bool SettingsManager::handleConfigUpgrade(QJsonObject& config)
Warning(_log, "Instance [%u]: HwLedCount/Layout mismatch! Setting Hardware LED count to number of LEDs configured via layout", _instance);
hwLedcount = layoutLedCount;
newDeviceConfig["hardwareLedCount"] = hwLedcount;
migrated = true;
}
}
}
if (newDeviceConfig.contains("type"))
{
QString type = newDeviceConfig["type"].toString();
if (type == "atmoorb" || type == "fadecandy" || type == "philipshue" )
{
if (newDeviceConfig.contains("output"))
{
newDeviceConfig["host"] = newDeviceConfig["output"].toString();
newDeviceConfig.remove("output");
migrated = true;
}
}
}
if (migrated)
{
config["device"] = newDeviceConfig;
Debug(_log, "LED-Device records migrated");
}
}
if (config.contains("grabberV4L2"))
{
QJsonObject newGrabberV4L2Config = config["grabberV4L2"].toObject();
if (newGrabberV4L2Config.contains("encoding_format"))
{
newGrabberV4L2Config.remove("encoding_format");
config["grabberV4L2"] = newGrabberV4L2Config;
migrated = true;
Debug(_log, "GrabberV4L2 records migrated");
}
}
if (config.contains("framegrabber"))
{
QJsonObject newFramegrabberConfig = config["framegrabber"].toObject();
//Align element naming with grabberV4L2
//Rename element type -> device
if (newFramegrabberConfig.contains("type"))
{
newFramegrabberConfig["device"] = newFramegrabberConfig["type"];
newFramegrabberConfig.remove("type");
migrated = true;
}
//Rename element frequency_Hz -> fps
if (newFramegrabberConfig.contains("frequency_Hz"))
{
newFramegrabberConfig["fps"] = newFramegrabberConfig["frequency_Hz"];
newFramegrabberConfig.remove("frequency_Hz");
migrated = true;
}
config["framegrabber"] = newFramegrabberConfig;
Debug(_log, "Framegrabber records migrated");
}
}
}
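The migration above renames configuration keys in place (`output` → `host` for network LED devices, `type` → `device` and `frequency_Hz` → `fps` for the framegrabber) and flags the config as migrated. The same pattern, sketched with a plain `std::map` standing in for `QJsonObject` (the `renameKey` helper is hypothetical, not Hyperion's API):

```cpp
#include <cassert>
#include <map>
#include <string>

// Rename a key if present and report whether anything changed,
// so the caller can set its 'migrated' flag and write the object back.
bool renameKey(std::map<std::string, std::string>& obj,
               const std::string& from, const std::string& to)
{
	auto it = obj.find(from);
	if (it == obj.end())
	{
		return false;           // nothing to migrate
	}
	obj[to] = it->second;       // e.g. "type" -> "device", "frequency_Hz" -> "fps"
	obj.erase(it);
	return true;                // caller sets migrated = true
}
```

Collecting the per-key results into one `migrated` flag, as the diff does, ensures the object is written back and logged only once per upgraded section.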


@@ -1,7 +1,6 @@
<RCC>
<qresource prefix="/">
<file alias="hyperion-schema">hyperion.schema.json</file>
<file alias="schema-general.json">schema/schema-general.json</file>
<file alias="schema-logger.json">schema/schema-logger.json</file>
<file alias="schema-device.json">schema/schema-device.json</file>


@@ -1,100 +1,142 @@
{
"type" : "object",
"title" : "edt_conf_fg_heading_title",
"properties":
{
	"enable": {
		"type": "boolean",
		"title": "edt_conf_general_enable_title",
		"required": true,
		"default": false,
		"propertyOrder": 1
	},
	"available_devices": {
		"type": "string",
		"title": "edt_conf_grabber_discovered_title",
		"default": "edt_conf_grabber_discovery_inprogress",
		"options": {
			"infoText": "edt_conf_grabber_discovered_title_info"
		},
		"propertyOrder": 2,
		"required": false
	},
	"device": {
		"type": "string",
		"title": "edt_conf_enum_custom",
		"options": {
			"hidden": true
		},
		"required": true,
		"comment": "The 'available_devices' settings are dynamically inserted into the WebUI under PropertyOrder '2'.",
		"propertyOrder": 3
	},
	"device_inputs": {
		"type": "string",
		"title": "edt_conf_v4l2_input_title",
		"propertyOrder": 4,
		"required": false
	},
	"input": {
		"type": "integer",
		"title": "edt_conf_enum_custom",
		"minimum": 0,
		"default": 0,
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 5,
		"comment": "The 'device_inputs' settings are dynamically inserted into the WebUI under PropertyOrder '4'."
	},
	"resolutions": {
		"type": "string",
		"title": "edt_conf_v4l2_resolution_title",
		"propertyOrder": 6,
		"required": false
	},
	"width": {
		"type": "integer",
		"title": "edt_conf_enum_custom",
		"minimum": 10,
		"default": 80,
		"append": "edt_append_pixel",
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 9,
		"comment": "The 'resolutions' settings are dynamically inserted into the WebUI under PropertyOrder '6'."
	},
	"height": {
		"type": "integer",
		"title": "edt_conf_enum_custom",
		"append": "edt_append_pixel",
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 10,
		"comment": "The 'resolutions' settings are dynamically inserted into the WebUI under PropertyOrder '6'."
	},
	"framerates": {
		"type": "string",
		"title": "edt_conf_fg_frequency_Hz_title",
		"propertyOrder": 11,
		"required": false
	},
	"fps": {
		"type": "integer",
		"title": "edt_conf_enum_custom",
		"default": 10,
		"minimum": 1,
		"append": "fps",
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 12,
		"comment": "The 'framerates' setting is dynamically inserted into the WebUI under PropertyOrder '11'."
	},
	"pixelDecimation": {
		"type": "integer",
		"title": "edt_conf_fg_pixelDecimation_title",
		"minimum": 1,
		"maximum": 30,
		"default": 8,
		"required": true,
		"propertyOrder": 13
	},
	"cropLeft": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropLeft_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"propertyOrder": 14
	},
	"cropRight": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropRight_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"propertyOrder": 15
	},
	"cropTop": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropTop_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"propertyOrder": 16
	},
	"cropBottom": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropBottom_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"propertyOrder": 17
	}
},
"additionalProperties" : false


@@ -2,263 +2,359 @@
"type" : "object",
"required" : true,
"title" : "edt_conf_v4l2_heading_title",
"properties":
{
	"enable": {
		"type": "boolean",
		"title": "edt_conf_general_enable_title",
		"required": true,
		"default": false,
		"propertyOrder": 1
	},
	"available_devices": {
		"type": "string",
		"title": "edt_conf_grabber_discovered_title",
		"default": "edt_conf_grabber_discovery_inprogress",
		"options": {
			"infoText": "edt_conf_grabber_discovered_title_info"
		},
		"propertyOrder": 2,
		"required": false
	},
	"device": {
		"type": "string",
		"title": "edt_conf_enum_custom",
		"options": {
			"hidden": true
		},
		"required": true,
		"comment": "The 'available_devices' settings are dynamically inserted into the WebUI under PropertyOrder '2'.",
		"propertyOrder": 3
	},
	"device_inputs": {
		"type": "string",
		"title": "edt_conf_v4l2_input_title",
		"propertyOrder": 4,
		"required": false
	},
	"input": {
		"type": "integer",
		"title": "edt_conf_enum_custom",
		"default": 0,
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 5,
		"comment": "The 'device_inputs' settings are dynamically inserted into the WebUI under PropertyOrder '4'."
	},
	"standard": {
		"type": "string",
		"title": "edt_conf_v4l2_standard_title",
		"required": false,
		"propertyOrder": 6
	},
	"encoding": {
		"type": "string",
		"title": "edt_conf_v4l2_encoding_title",
		"required": false,
		"access": "advanced",
		"propertyOrder": 7
	},
	"resolutions": {
		"type": "string",
		"title": "edt_conf_v4l2_resolution_title",
		"propertyOrder": 8,
		"required": false
	},
	"width": {
		"type": "integer",
		"title": "edt_conf_fg_width_title",
		"default": 0,
		"minimum": 0,
		"append": "edt_append_pixel",
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 9,
		"comment": "The 'resolutions' settings are dynamically inserted into the WebUI under PropertyOrder '8'."
	},
	"height": {
		"type": "integer",
		"title": "edt_conf_fg_height_title",
		"default": 0,
		"minimum": 0,
		"append": "edt_append_pixel",
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 10,
		"comment": "The 'resolutions' settings are dynamically inserted into the WebUI under PropertyOrder '8'."
	},
	"framerates": {
		"type": "string",
		"title": "edt_conf_v4l2_framerate_title",
		"propertyOrder": 11,
		"required": false
	},
	"fps": {
		"type": "integer",
		"title": "edt_conf_enum_custom",
		"default": 15,
		"minimum": 0,
		"append": "fps",
		"options": {
			"hidden": true
		},
		"required": true,
		"propertyOrder": 12,
		"comment": "The 'framerates' setting is dynamically inserted into the WebUI under PropertyOrder '11'."
	},
	"fpsSoftwareDecimation": {
		"type": "integer",
		"title": "edt_conf_v4l2_fpsSoftwareDecimation_title",
		"minimum": 0,
		"maximum": 60,
		"default": 0,
		"required": true,
		"access": "expert",
		"propertyOrder": 13
	},
	"flip": {
		"type": "string",
		"title": "edt_conf_v4l2_flip_title",
		"enum": [ "NO_CHANGE", "HORIZONTAL", "VERTICAL", "BOTH" ],
		"default": "NO_CHANGE",
		"options": {
			"enum_titles": [ "edt_conf_enum_NO_CHANGE", "edt_conf_enum_HORIZONTAL", "edt_conf_enum_VERTICAL", "edt_conf_enum_BOTH" ]
		},
		"required": true,
		"access": "advanced",
		"propertyOrder": 14
	},
	"sizeDecimation": {
		"type": "integer",
		"title": "edt_conf_v4l2_sizeDecimation_title",
		"minimum": 1,
		"maximum": 30,
		"default": 8,
		"required": true,
		"propertyOrder": 15
	},
	"hardware_brightness": {
		"type": "integer",
		"title": "edt_conf_v4l2_hardware_brightness_title",
		"default": 0,
		"required": true,
		"access": "expert",
		"propertyOrder": 16
	},
	"hardware_contrast": {
		"type": "integer",
		"title": "edt_conf_v4l2_hardware_contrast_title",
		"default": 0,
		"required": true,
		"access": "expert",
		"propertyOrder": 17
	},
	"hardware_saturation": {
		"type": "integer",
		"title": "edt_conf_v4l2_hardware_saturation_title",
		"default": 0,
		"required": true,
		"access": "expert",
		"propertyOrder": 18
	},
	"hardware_hue": {
		"type": "integer",
		"title": "edt_conf_v4l2_hardware_hue_title",
		"default": 0,
		"required": true,
		"access": "expert",
		"propertyOrder": 19
	},
	"cropLeft": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropLeft_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"required": true,
		"propertyOrder": 20
	},
	"cropRight": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropRight_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"required": true,
		"propertyOrder": 21
	},
	"cropTop": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropTop_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"required": true,
		"propertyOrder": 22
	},
	"cropBottom": {
		"type": "integer",
		"title": "edt_conf_v4l2_cropBottom_title",
		"minimum": 0,
		"default": 0,
		"append": "edt_append_pixel",
		"required": true,
		"propertyOrder": 23
	},
	"cecDetection": {
		"type": "boolean",
		"title": "edt_conf_v4l2_cecDetection_title",
		"default": false,
		"required": true,
		"access": "advanced",
		"propertyOrder": 24
	},
	"signalDetection": {
		"type": "boolean",
		"title": "edt_conf_v4l2_signalDetection_title",
		"default": false,
		"required": true,
		"access": "expert",
		"propertyOrder": 25
	},
	"redSignalThreshold": {
		"type": "integer",
		"title": "edt_conf_v4l2_redSignalThreshold_title",
		"minimum": 0,
		"maximum": 100,
		"default": 0,
		"append": "edt_append_percent",
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"access": "expert",
		"required": true,
		"propertyOrder": 26
	},
	"greenSignalThreshold": {
		"type": "integer",
		"title": "edt_conf_v4l2_greenSignalThreshold_title",
		"minimum": 0,
		"maximum": 100,
		"default": 100,
		"append": "edt_append_percent",
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 27
	},
	"blueSignalThreshold": {
		"type": "integer",
		"title": "edt_conf_v4l2_blueSignalThreshold_title",
		"minimum": 0,
		"maximum": 100,
		"default": 0,
		"append": "edt_append_percent",
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 28
	},
	"noSignalCounterThreshold": {
		"type": "integer",
		"title": "edt_conf_v4l2_noSignalCounterThreshold_title",
		"minimum": 1,
		"maximum": 1000,
		"default": 200,
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 29
	},
	"sDVOffsetMin": {
		"type": "number",
		"title": "edt_conf_v4l2_sDVOffsetMin_title",
		"minimum": 0.0,
		"maximum": 1.0,
		"default": 0.1,
		"step": 0.01,
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 30
	},
	"sDVOffsetMax": {
		"type": "number",
		"title": "edt_conf_v4l2_sDVOffsetMax_title",
		"minimum": 0.0,
		"maximum": 1.0,
		"default": 0.9,
		"step": 0.01,
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 31
	},
	"sDHOffsetMin": {
		"type": "number",
		"title": "edt_conf_v4l2_sDHOffsetMin_title",
		"minimum": 0.0,
		"maximum": 1.0,
		"default": 0.4,
		"step": 0.01,
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 32
	},
	"sDHOffsetMax": {
		"type": "number",
		"title": "edt_conf_v4l2_sDHOffsetMax_title",
		"minimum": 0.0,
		"maximum": 1.0,
		"default": 0.46,
		"step": 0.01,
		"options": {
			"dependencies": {
				"signalDetection": true
			}
		},
		"required": true,
		"access": "expert",
		"propertyOrder": 33
	}
},
"additionalProperties": true
}


@@ -2,43 +2,52 @@
"type" : "object",
"required" : true,
"title" : "edt_conf_instC_heading_title",
"properties": {
	"systemEnable": {
		"type": "boolean",
		"required": true,
		"title": "edt_conf_instC_systemEnable_title",
		"default": false,
		"propertyOrder": 1
	},
	"systemGrabberDevice": {
		"type": "string",
		"required": true,
		"title": "edt_conf_instC_screen_grabber_device_title",
		"default": "NONE",
		"propertyOrder": 2
	},
	"systemPriority": {
		"type": "integer",
		"required": true,
		"title": "edt_conf_general_priority_title",
		"minimum": 100,
		"maximum": 253,
		"default": 250,
		"propertyOrder": 3
	},
	"v4lEnable": {
		"type": "boolean",
		"required": true,
		"title": "edt_conf_instC_v4lEnable_title",
		"default": false,
		"propertyOrder": 4
	},
	"v4lGrabberDevice": {
		"type": "string",
		"required": true,
		"title": "edt_conf_instC_video_grabber_device_title",
		"default": "NONE",
		"propertyOrder": 5
	},
	"v4lPriority": {
		"type": "integer",
		"required": true,
		"title": "edt_conf_general_priority_title",
		"minimum": 100,
		"maximum": 253,
		"default": 240,
		"propertyOrder": 6
}
},
"additionalProperties" : false


@@ -147,7 +147,6 @@
"ledBlacklist": {
"type": "array",
"title": "conf_leds_layout_blacklist_rules_title",
"uniqueItems": true,
"items": {
"type": "object",


@@ -50,7 +50,7 @@ bool LedDeviceAtmoOrb::init(const QJsonObject &deviceConfig)
if ( LedDevice::init(deviceConfig) )
{
_multicastGroup = deviceConfig["host"].toString(MULTICAST_GROUP_DEFAULT_ADDRESS);
_multiCastGroupPort = static_cast<quint16>(deviceConfig["port"].toInt(MULTICAST_GROUP_DEFAULT_PORT));
_useOrbSmoothing = deviceConfig["useOrbSmoothing"].toBool(false);
_skipSmoothingDiff = deviceConfig["skipSmoothingDiff"].toInt(0);


@@ -55,7 +55,7 @@ bool LedDeviceFadeCandy::init(const QJsonObject& deviceConfig)
}
else
{
_host = deviceConfig["host"].toString("127.0.0.1");
_port = deviceConfig["port"].toInt(STREAM_DEFAULT_PORT);
//If host not configured the init fails


@@ -12,7 +12,7 @@ namespace {
bool verbose = false;
// Configuration settings
const char CONFIG_ADDRESS[] = "host";
//const char CONFIG_PORT[] = "port";
const char CONFIG_USERNAME[] = "username";
const char CONFIG_CLIENTKEY[] = "clientkey";


@@ -16,11 +16,10 @@
"access": "advanced",
"propertyOrder": 2
},
"host": {
"type": "string",
"title": "edt_dev_spec_multicastGroup_title",
"default": "239.255.255.250",
"access": "expert",
"propertyOrder": 3
},
"port": {


@@ -1,110 +1,110 @@
{
"type":"object",
"required":true,
"properties": {
	"host": {
		"type": "string",
		"title": "edt_dev_spec_targetIp_title",
		"default": "127.0.0.1",
		"propertyOrder": 1
	},
	"port": {
		"type": "number",
		"title": "edt_dev_spec_port_title",
		"default": 7890,
		"propertyOrder": 2
	},
	"latchTime": {
		"type": "integer",
		"title": "edt_dev_spec_latchtime_title",
		"default": 0,
		"append": "edt_append_ms",
		"minimum": 0,
		"maximum": 1000,
		"access": "expert",
		"propertyOrder": 3
	},
	"setFcConfig": {
		"type": "boolean",
		"title": "edt_dev_spec_FCsetConfig_title",
		"default": false,
		"propertyOrder": 4
	},
	"manualLed": {
		"type": "boolean",
		"title": "edt_dev_spec_FCmanualControl_title",
		"default": false,
		"options": {
			"dependencies": {
				"setFcConfig": true
			}
		},
		"propertyOrder": 5
	},
	"ledOn": {
		"type": "boolean",
		"title": "edt_dev_spec_FCledToOn_title",
		"default": false,
		"options": {
			"dependencies": {
				"setFcConfig": true
			}
		},
		"propertyOrder": 6
	},
	"interpolation": {
		"type": "boolean",
		"title": "edt_dev_spec_interpolation_title",
		"default": false,
		"options": {
			"dependencies": {
				"setFcConfig": true
			}
		},
		"propertyOrder": 7
	},
	"dither": {
		"type": "boolean",
		"title": "edt_dev_spec_dithering_title",
		"default": false,
		"options": {
			"dependencies": {
				"setFcConfig": true
			}
		},
		"propertyOrder": 8
	},
	"gamma": {
		"type": "number",
		"title": "edt_dev_spec_gamma_title",
		"default": 1.0,
		"minimum": 0.1,
		"maximum": 5.0,
		"options": {
			"dependencies": {
				"setFcConfig": true
			}
		},
		"propertyOrder": 9
	},
	"whitepoint": {
		"type": "array",
		"title": "edt_dev_spec_whitepoint_title",
		"options": {
			"dependencies": {
				"setFcConfig": true
			}
		},
		"propertyOrder": 10,
		"default": [ 255, 255, 255 ],
		"maxItems": 3,
		"minItems": 3,
		"format": "colorpicker",
		"items": {
			"type": "integer",
			"minimum": 0,
			"maximum": 255,
			"default": 255
}
}
},


@@ -2,7 +2,7 @@
"type": "object",
"required": true,
"properties": {
"host": {
"type": "string",
"title": "edt_dev_spec_targetIp_title",
"default": "",


@@ -3,7 +3,6 @@
// qt incl
#include <QDir>
#include <QFileInfo>
// hyperion include
#include <hyperion/Hyperion.h>


@@ -3,30 +3,17 @@
#include <utils/Logger.h>
ImageResampler::ImageResampler()
: _horizontalDecimation(8)
, _verticalDecimation(8)
, _cropLeft(0)
, _cropRight(0)
, _cropTop(0)
, _cropBottom(0)
, _videoMode(VideoMode::VIDEO_2D)
, _flipMode(FlipMode::NO_CHANGE)
{
}
void ImageResampler::setCropping(int cropLeft, int cropRight, int cropTop, int cropBottom)
{
_cropLeft = cropLeft;
@@ -35,15 +22,12 @@ void ImageResampler::setCropping(int cropLeft, int cropRight, int cropTop, int c
_cropBottom = cropBottom;
}
void ImageResampler::processImage(const uint8_t * data, int width, int height, int lineLength, PixelFormat pixelFormat, Image<ColorRgb> &outputImage) const
{
int cropRight = _cropRight;
int cropBottom = _cropBottom;
int xDestFlip = 0, yDestFlip = 0;
int uOffset = 0, vOffset = 0;
// handle 3D mode
switch (_videoMode)
@@ -67,11 +51,40 @@ void ImageResampler::processImage(const uint8_t * data, int width, int height, i
for (int yDest = 0, ySource = _cropTop + (_verticalDecimation >> 1); yDest < outputHeight; ySource += _verticalDecimation, ++yDest)
{
int yOffset = lineLength * ySource;
if (pixelFormat == PixelFormat::NV12)
{
uOffset = (height + ySource / 2) * lineLength;
}
else if (pixelFormat == PixelFormat::I420)
{
uOffset = width * height + (ySource/2) * width/2;
vOffset = width * height * 1.25 + (ySource/2) * width/2;
}
for (int xDest = 0, xSource = _cropLeft + (_horizontalDecimation >> 1); xDest < outputWidth; xSource += _horizontalDecimation, ++xDest)
{
switch (_flipMode)
{
case FlipMode::HORIZONTAL:
xDestFlip = xDest;
yDestFlip = outputHeight-yDest-1;
break;
case FlipMode::VERTICAL:
xDestFlip = outputWidth-xDest-1;
yDestFlip = yDest;
break;
case FlipMode::BOTH:
xDestFlip = outputWidth-xDest-1;
yDestFlip = outputHeight-yDest-1;
break;
case FlipMode::NO_CHANGE:
xDestFlip = xDest;
yDestFlip = yDest;
break;
}
ColorRgb &rgb = outputImage(xDestFlip, yDestFlip);
switch (pixelFormat)
{
case PixelFormat::UYVY:
@@ -124,7 +137,24 @@ void ImageResampler::processImage(const uint8_t * data, int width, int height, i
rgb.red = data[index+2];
}
break;
case PixelFormat::NV12:
{
uint8_t y = data[yOffset + xSource];
uint8_t u = data[uOffset + ((xSource >> 1) << 1)];
uint8_t v = data[uOffset + ((xSource >> 1) << 1) + 1];
ColorSys::yuv2rgb(y, u, v, rgb.red, rgb.green, rgb.blue);
}
break;
case PixelFormat::I420:
{
int y = data[yOffset + xSource];
int u = data[uOffset + (xSource >> 1)];
int v = data[vOffset + (xSource >> 1)];
ColorSys::yuv2rgb(y, u, v, rgb.red, rgb.green, rgb.blue);
}
break;
#ifdef HAVE_TURBO_JPEG
case PixelFormat::MJPEG:
break;
#endif
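The resampler's new flip handling maps each destination pixel to a mirrored coordinate before writing: `HORIZONTAL` mirrors rows, `VERTICAL` mirrors columns, `BOTH` mirrors both. The index arithmetic can be isolated as follows (the `flipDest` helper is illustrative; the real code inlines this switch inside `ImageResampler::processImage`):

```cpp
#include <cassert>
#include <utility>

enum class FlipMode { NO_CHANGE, HORIZONTAL, VERTICAL, BOTH };

// Map a destination pixel (x, y) in a w x h output image to its
// flipped position, mirroring the switch in the diff above.
std::pair<int, int> flipDest(FlipMode mode, int x, int y, int w, int h)
{
	switch (mode)
	{
	case FlipMode::HORIZONTAL: return { x, h - y - 1 };         // mirror rows
	case FlipMode::VERTICAL:   return { w - x - 1, y };         // mirror columns
	case FlipMode::BOTH:       return { w - x - 1, h - y - 1 }; // mirror both
	case FlipMode::NO_CHANGE:  break;
	}
	return { x, y };
}
```

Because the mapping only changes where the computed RGB value is stored, the per-pixel decode (UYVY, NV12, I420, etc.) is untouched and flipping costs no extra pass over the image.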