
github.com/ValveSoftware/openvr.git
author    Joe Ludwig <joe@valvesoftware.com>  2018-11-27 19:54:42 +0300
committer Joe Ludwig <joe@valvesoftware.com>  2018-11-27 19:54:42 +0300
commit    25d2b814afa76e5a482638a78138da47ba177599 (patch)
tree      0cf1c1c3ff7d39986bdf04a0dbee70c21ec16f05 /samples
parent    1fb1030f2ac238456dca7615a4408fb2bb42afb6 (diff)
OpenVR SDK Update 1.1.3
General:
* Added the required SteamVR version number to the header file.

IVRCompositor:
* New VRCompositor_FrameTiming reprojection flag: VRCompositor_ReprojectionMotion. This flag is set for application frames where motion smoothing was applied during at least one of the times the frame was displayed.
* New interface method IsMotionSmoothingEnabled to determine whether the user has enabled motion smoothing.

IVRChaperone:
* Added VREvent_ChaperoneFlushCache, which is sent when the application should reload any data it cached from the chaperone API. This event indicates that the user's current chaperone settings have changed.
* The VREvent_ChaperoneDataHasChanged event is no longer sent, and IVRChaperone::ReloadInfo no longer has any effect.

IVRChaperoneSetup:
* Removed some unimplemented functions:
  * SetWorkingCollisionBoundsTagsInfo
  * GetLiveCollisionBoundsTagsInfo
  * SetWorkingPhysicalBoundsInfo
  * GetLivePhysicalBoundsInfo
* Added ShowWorkingSetPreview/HideWorkingSetPreview, which cause the application's current working set to be shown as the chaperone bounds rendered by the compositor. Unless your application modifies the user's chaperone bounds, you won't need to call these functions. They are independent of the bounds turning on and off based on how close the user is to them.

IVROverlay:
* Added flag VROverlayFlags_MakeOverlaysInteractiveIfVisible. If this flag is set on an overlay and that overlay is visible, SteamVR is placed into laser mouse mode. This prevents the scene application from receiving any input, so use this flag carefully.
* Changed SetOverlayDualAnalogTransform to take a pointer, to better support the C API.

IVRTrackedCamera:
* For headsets (like the Vive Pro) that include multiple camera images in a single video stream, GetCameraIntrinsics and GetCameraProjection now take a uint32_t nCameraIndex parameter to select the specific camera.
IVRInput:
* Added an argument to GetOriginLocalizedName that lets the caller specify which parts of the name they want in the returned string. The possible values are:
  * VRInputString_Hand - which hand the origin is in, e.g. "Left Hand"
  * VRInputString_ControllerType - what kind of controller the user has in that hand, e.g. "Vive Controller"
  * VRInputString_InputSource - what part of that controller is the origin, e.g. "Trackpad"
  * VRInputString_All - all of the above, e.g. "Left Hand Vive Controller Trackpad"
* Skeletal input:
  * Added GetBoneCount, which returns the number of bones in the skeleton associated with an action.
  * Added GetBoneHierarchy, which returns the index of each bone's parent in a user-provided array.
  * Added GetBoneName, which retrieves the name of each bone in the skeleton.
  * Added GetSkeletalReferenceTransforms, which retrieves the transforms for several specific hand skeleton poses:
    * Bind Pose
    * Open Hand
    * Fist
    * Grip Limit, which is the shape of the hand when closed around the controller
  * Added GetSkeletalTrackingLevel, which retrieves an estimate of the level of detail with which the controller associated with an action can track the actual movement of the user's body. The levels are:
    * Estimated: body part location can't be directly determined by the device. Any skeletal pose provided by the device is estimated by assuming the position required to activate buttons, triggers, joysticks, or other input sensors, e.g. Vive Controller, gamepad.
    * Partial: body part location can be measured directly but with fewer degrees of freedom than the actual body part. Certain body part positions may be unmeasured by the device and estimated from other input data, e.g. Knuckles, gloves that only measure finger curl.
    * Full: body part location can be measured directly throughout the entire range of motion of the body part, e.g. a mocap suit for the full body, gloves that measure rotation of each finger segment.
  * Added GetSkeletalSummaryData, which returns metadata about the current pose of the hand, such as finger curl and splay.
  * Removed ulRestrictToDevice as a parameter from all skeletal input functions.

Driver API:
* Added TrackedControllerRole_Treadmill, which lets a driver specify that a device is intended to function as a treadmill. This opts the device out of hand selection. The new input path /user/treadmill is automatically assigned to the first treadmill device to activate.

IVRCameraComponent:
* Added CVS_FORMAT_BAYER16BG for cameras that support delivering raw sensor data.
* Added a camera index to GetCameraDistortion, GetCameraProjection, and GetCameraIntrinsics to support multiple cameras on the same device (see also IVRTrackedCamera).
* Added the ability for cameras to return one of a small set of programmatic distortion function types and function parameters, in addition to or instead of UV-sampling distortion through GetCameraDistortion. See EVRDistortionFunctionType and IVRCameraComponent::GetCameraIntrinsics, and refer to the OpenCV camera calibration and undistortion documentation.

IVRDriverInput:
* Added a parameter to CreateSkeletonComponent that allows the driver to specify the skeletal tracking level the controller supports.

Samples:
* Fixed a texture corruption bug in hellovr_vulkan when a controller is turned on after starting the application.

[git-p4: depot-paths = "//vr/steamvr/sdk_release/": change = 4837234]
Diffstat (limited to 'samples')
-rw-r--r--  samples/bin/linux64/libopenvr_api.so           | bin 294808 -> 299168 bytes
-rw-r--r--  samples/bin/osx32/libopenvr_api.dylib          | bin 306968 -> 307176 bytes
-rw-r--r--  samples/bin/win32/openvr_api.dll               | bin 485664 -> 486176 bytes
-rw-r--r--  samples/bin/win64/openvr_api.dll               | bin 592160 -> 593184 bytes
-rw-r--r--  samples/hellovr_vulkan/hellovr_vulkan_main.cpp | 39
5 files changed, 38 insertions(+), 1 deletion(-)
diff --git a/samples/bin/linux64/libopenvr_api.so b/samples/bin/linux64/libopenvr_api.so
index 92ac9c2..46dac9e 100644
--- a/samples/bin/linux64/libopenvr_api.so
+++ b/samples/bin/linux64/libopenvr_api.so
Binary files differ
diff --git a/samples/bin/osx32/libopenvr_api.dylib b/samples/bin/osx32/libopenvr_api.dylib
index b2877ce..2e59051 100644
--- a/samples/bin/osx32/libopenvr_api.dylib
+++ b/samples/bin/osx32/libopenvr_api.dylib
Binary files differ
diff --git a/samples/bin/win32/openvr_api.dll b/samples/bin/win32/openvr_api.dll
index 0a348d0..c440068 100644
--- a/samples/bin/win32/openvr_api.dll
+++ b/samples/bin/win32/openvr_api.dll
Binary files differ
diff --git a/samples/bin/win64/openvr_api.dll b/samples/bin/win64/openvr_api.dll
index 6988b78..f4ddebd 100644
--- a/samples/bin/win64/openvr_api.dll
+++ b/samples/bin/win64/openvr_api.dll
Binary files differ
diff --git a/samples/hellovr_vulkan/hellovr_vulkan_main.cpp b/samples/hellovr_vulkan/hellovr_vulkan_main.cpp
index b72ce48..07aa245 100644
--- a/samples/hellovr_vulkan/hellovr_vulkan_main.cpp
+++ b/samples/hellovr_vulkan/hellovr_vulkan_main.cpp
@@ -329,7 +329,7 @@ void dprintf( const char *fmt, ... )
char buffer[ 2048 ];
va_start( args, fmt );
- vsprintf_s( buffer, fmt, args );
+ vsnprintf( buffer, sizeof( buffer ), fmt, args );
va_end( args );
if ( g_bPrintf )
@@ -1458,6 +1458,9 @@ bool CMainApplication::BInitVulkan()
vkQueueSubmit( m_pQueue, 1, &submitInfo, m_currentCommandBuffer.m_pFence );
m_commandBuffers.push_front( m_currentCommandBuffer );
+ m_currentCommandBuffer.m_pCommandBuffer = VK_NULL_HANDLE;
+ m_currentCommandBuffer.m_pFence = VK_NULL_HANDLE;
+
// Wait for the GPU before proceeding
vkQueueWaitIdle( m_pQueue );
@@ -1720,6 +1723,9 @@ void CMainApplication::RenderFrame()
// Add the command buffer back for later recycling
m_commandBuffers.push_front( m_currentCommandBuffer );
+ m_currentCommandBuffer.m_pCommandBuffer = VK_NULL_HANDLE;
+ m_currentCommandBuffer.m_pFence = VK_NULL_HANDLE;
+
// Submit to SteamVR
vr::VRTextureBounds_t bounds;
bounds.uMin = 0.0f;
@@ -3303,6 +3309,20 @@ VulkanRenderModel *CMainApplication::FindOrLoadRenderModel( vr::TrackedDeviceInd
m_pDescriptorSets[ DESCRIPTOR_SET_LEFT_EYE_RENDER_MODEL0 + unTrackedDeviceIndex ],
m_pDescriptorSets[ DESCRIPTOR_SET_RIGHT_EYE_RENDER_MODEL0 + unTrackedDeviceIndex ],
};
+
+ // If this gets called during HandleInput() there will be no command buffer current, so create one
+ // and submit it immediately.
+ bool bNewCommandBuffer = false;
+ if ( m_currentCommandBuffer.m_pCommandBuffer == VK_NULL_HANDLE )
+ {
+ m_currentCommandBuffer = GetCommandBuffer();
+
+ // Start the command buffer
+ VkCommandBufferBeginInfo commandBufferBeginInfo = { VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO };
+ commandBufferBeginInfo.flags = VK_COMMAND_BUFFER_USAGE_ONE_TIME_SUBMIT_BIT;
+ vkBeginCommandBuffer( m_currentCommandBuffer.m_pCommandBuffer, &commandBufferBeginInfo );
+ bNewCommandBuffer = true;
+ }
if ( !pRenderModel->BInit( m_pDevice, m_physicalDeviceMemoryProperties, m_currentCommandBuffer.m_pCommandBuffer, unTrackedDeviceIndex, pDescriptorSets, *pModel, *pTexture ) )
{
dprintf( "Unable to create Vulkan model from render model %s\n", pchRenderModelName );
@@ -3312,6 +3332,23 @@ VulkanRenderModel *CMainApplication::FindOrLoadRenderModel( vr::TrackedDeviceInd
else
{
m_vecRenderModels.push_back( pRenderModel );
+
+ // If this was called during HandleInput() there was no command buffer current, so submit it now.
+ if ( bNewCommandBuffer )
+ {
+ vkEndCommandBuffer( m_currentCommandBuffer.m_pCommandBuffer );
+
+ // Submit now
+ VkSubmitInfo submitInfo = { VK_STRUCTURE_TYPE_SUBMIT_INFO };
+ submitInfo.commandBufferCount = 1;
+ submitInfo.pCommandBuffers = &m_currentCommandBuffer.m_pCommandBuffer;
+ vkQueueSubmit( m_pQueue, 1, &submitInfo, m_currentCommandBuffer.m_pFence );
+ m_commandBuffers.push_front( m_currentCommandBuffer );
+
+ // Reset current command buffer
+ m_currentCommandBuffer.m_pCommandBuffer = VK_NULL_HANDLE;
+ m_currentCommandBuffer.m_pFence = VK_NULL_HANDLE;
+ }
}
vr::VRRenderModels()->FreeRenderModel( pModel );
vr::VRRenderModels()->FreeTexture( pTexture );