醬學(xué)堂 | Getting Started with HoloLens Development (HoloLens Development Tutorial)
Development Requirements
HoloLens runs on Windows 10, and its applications are built on UWP (the Universal Windows Platform). Developing holographic experiences like those on HoloLens also places fairly high demands on your PC.
Hardware requirements:
1. 64-bit Windows 10 Pro, Enterprise, or Education (the Home edition does not support Hyper-V)
2. 64-bit CPU
3. 8 GB of RAM or more
4. The BIOS must support and enable the following features:
- Hardware-assisted virtualization
- Second Level Address Translation (SLAT)
- Hardware-based Data Execution Prevention (DEP)
5. GPU: DirectX 11.0 or later, with a WDDM 1.2 driver or later
About Hyper-V: it is Microsoft's virtualization product, based on hypervisor technology similar to VMware's products and Citrix's open-source Xen.
Part 2: Installation
1. Enable virtualization, i.e. turn on hardware virtualization on your PC.
For detailed steps, see:
https://msdn.microsoft.com/library/windows/apps/jj863509(v=vs.105).aspx
2. Enable Hyper-V.
3. Install Visual Studio 2017 or Visual Studio 2015 Update 3 (https://developer.microsoft.com/en-us/windows/downloads)
4. Install the HoloLens emulator (https://developer.microsoft.com/en-us/windows/mixed-reality/hololens_emulator_archive)
5. Install Unity (https://unity3d.com/cn/get-unity/download)
For a detailed installation walkthrough, you can watch this video tutorial by a foreign developer. For some reason it could not pass Tencent Video's review, so watch it on Youku or follow the "Read the original" link.
Video: (embedded in the original post)
Part 3: About the HoloLens Emulator
The HoloLens emulator lets you test holographic applications on a PC without a physical HoloLens and comes with the HoloLens development toolset. The emulator runs inside a Hyper-V virtual machine.
Input:
- Walk forward, back, left, and right – use the W, A, S, and D keys on the keyboard, or the left stick on an Xbox controller.
- Look up, down, left, and right – click and drag the mouse, use the arrow keys on the keyboard, or the right stick on an Xbox controller.
- Air tap gesture – right-click the mouse, press Enter on the keyboard, or press the A button on an Xbox controller.
- Bloom gesture – press the Windows key or F2 on the keyboard, or press the B button on an Xbox controller.
- Hand movement for scrolling – hold Alt and the right mouse button and drag the mouse up/down, or on an Xbox controller hold the right trigger and the A button and move the right stick up and down.
Toolbar:
On the right side of the main window you will find the emulator toolbar, which contains the following buttons:
- Close: closes the emulator.
- Minimize: minimizes the emulator window.
- Human Input: the mouse and keyboard are used to simulate human input to the emulator.
- Keyboard and Mouse Input: keyboard and mouse input is passed directly to the HoloLens OS as keyboard and mouse events, as if a Bluetooth keyboard and mouse were connected.
- Fit to Screen: fits the emulator to the screen.
- Zoom: makes the emulator larger or smaller.
- Help: opens the emulator help.
- Open Device Portal: opens the Windows Device Portal for the HoloLens OS in the emulator.
- Tools: opens the "Additional Tools" pane.
Part 4: Development – Hello, HoloLens!
First, create a new project in Unity and add a simple 3D model for testing, for example:
Next, switch the build platform to Windows Store in the Build Settings, as noted below:
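(The Build Settings screenshot is not reproduced here. As a rough guide for Unity versions of that period, the Windows Store settings commonly used for HoloLens were: SDK set to Universal 10, UWP Build Type set to D3D, and the Unity C# Projects checkbox enabled, with Virtual Reality Supported and the Windows Holographic SDK enabled under Player Settings. Treat these as typical values rather than exact instructions and check them against your installed Unity version.)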
Then click Build to generate the Visual Studio project:
Launch Visual Studio:
By default, a UWP application exported from Unity can run on any Windows 10 device. Since HoloLens is different, the application should take advantage of features that are only available on HoloLens. To do this, open the Package.appxmanifest file in Visual Studio and set the TargetDeviceFamily to "Windows.Holographic", as follows:
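The original screenshot of this edit is not reproduced here. Below is a minimal sketch of the relevant fragment of Package.appxmanifest; the MinVersion and MaxVersionTested values are only illustrative and depend on the SDK you have installed:

<Dependencies>
  <!-- Change the family name from Windows.Universal to Windows.Holographic -->
  <TargetDeviceFamily Name="Windows.Holographic" MinVersion="10.0.10240.0" MaxVersionTested="10.0.10586.0" />
</Dependencies>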
After that, you can run it:
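(If you are targeting the emulator, the usual workflow is to set the build configuration to x86 in Visual Studio, choose HoloLens Emulator as the deployment target, and start the app from the Debug menu; this reflects the standard procedure in Microsoft's documentation, so check the exact menu names against your Visual Studio version.)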
Part 5: Summary of Input Events
1. Gaze
On HoloLens, gaze is driven by the position and orientation of the user's head, not their eyes.
Sample code (note: the core is the raycast):
using UnityEngine;

public class WorldCursor : MonoBehaviour
{
    private MeshRenderer meshRenderer;

    // Use this for initialization
    void Start()
    {
        // Grab the mesh renderer that's on the same object as this script.
        meshRenderer = this.gameObject.GetComponentInChildren<MeshRenderer>();
    }

    // Update is called once per frame
    void Update()
    {
        // Do a raycast into the world based on the user's
        // head position and orientation.
        var headPosition = Camera.main.transform.position;
        var gazeDirection = Camera.main.transform.forward;

        RaycastHit hitInfo;
        if (Physics.Raycast(headPosition, gazeDirection, out hitInfo))
        {
            // If the raycast hit a hologram...
            // Display the cursor mesh.
            meshRenderer.enabled = true;

            // Move the cursor to the point where the raycast hit.
            this.transform.position = hitInfo.point;

            // Rotate the cursor to hug the surface of the hologram.
            this.transform.rotation = Quaternion.FromToRotation(Vector3.up, hitInfo.normal);
        }
        else
        {
            // If the raycast did not hit a hologram, hide the cursor mesh.
            meshRenderer.enabled = false;
        }
    }
}
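In practice this script is attached to a cursor object in the scene (a small mesh such as a flattened ring works well). Because the script grabs that object's own MeshRenderer in Start, the cursor is only drawn while the gaze ray hits a hologram with a collider, and it is positioned and oriented to sit on the surface that was hit.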
2. Gesture input
Sample code:
using UnityEngine;
using UnityEngine.VR.WSA.Input;

public class GazeGestureManager : MonoBehaviour
{
    public static GazeGestureManager Instance { get; private set; }

    // Represents the hologram that is currently being gazed at.
    public GameObject FocusedObject { get; private set; }

    GestureRecognizer recognizer;

    // Use this for initialization
    void Start()
    {
        Instance = this;

        // Set up a GestureRecognizer to detect Select gestures.
        recognizer = new GestureRecognizer();
        recognizer.TappedEvent += (source, tapCount, ray) =>
        {
            // Send an OnSelect message to the focused object and its ancestors.
            if (FocusedObject != null)
            {
                FocusedObject.SendMessageUpwards("OnSelect");
            }
        };
        recognizer.StartCapturingGestures();
    }

    // Update is called once per frame
    void Update()
    {
        // Figure out which hologram is focused this frame.
        GameObject oldFocusObject = FocusedObject;

        // Do a raycast into the world based on the user's
        // head position and orientation.
        var headPosition = Camera.main.transform.position;
        var gazeDirection = Camera.main.transform.forward;

        RaycastHit hitInfo;
        if (Physics.Raycast(headPosition, gazeDirection, out hitInfo))
        {
            // If the raycast hit a hologram, use that as the focused object.
            FocusedObject = hitInfo.collider.gameObject;
        }
        else
        {
            // If the raycast did not hit a hologram, clear the focused object.
            FocusedObject = null;
        }

        // If the focused object changed this frame,
        // start detecting fresh gestures again.
        if (FocusedObject != oldFocusObject)
        {
            recognizer.CancelGestures();
            recognizer.StartCapturingGestures();
        }
    }
}
The Update method keeps checking which object is being gazed at and records it as the focused object, so that when the user taps, a tap event can be sent to that object. The GestureRecognizer is responsible for recognizing the user's gestures.
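For the tap to have a visible effect, the gazed-at hologram needs a script that implements the OnSelect method that SendMessageUpwards is looking for. Below is a minimal sketch of such a receiver; the class name SphereCommands and the drop-under-gravity reaction are only an illustration and assume the hologram has a Rigidbody:

using UnityEngine;

public class SphereCommands : MonoBehaviour
{
    // Called by GazeGestureManager via SendMessageUpwards("OnSelect")
    // when the user air-taps while gazing at this hologram.
    void OnSelect()
    {
        // Illustrative reaction: let the hologram fall under gravity.
        var rb = GetComponent<Rigidbody>();
        if (rb != null)
        {
            rb.useGravity = true;
        }
    }
}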
3. Voice input
Sample code:
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

public class SpeechManager : MonoBehaviour
{
    KeywordRecognizer keywordRecognizer = null;
    Dictionary<string, System.Action> keywords = new Dictionary<string, System.Action>();

    // Use this for initialization
    void Start()
    {
        keywords.Add("Reset world", () =>
        {
            // Call the OnReset method on every descendant object.
            this.BroadcastMessage("OnReset");
        });

        keywords.Add("Drop Object", () =>
        {
            var focusObject = GazeGestureManager.Instance.FocusedObject;
            if (focusObject != null)
            {
                // Call the OnDrop method on just the focused object.
                focusObject.SendMessage("OnDrop");
            }
        });

        // Tell the KeywordRecognizer about our keywords.
        keywordRecognizer = new KeywordRecognizer(keywords.Keys.ToArray());

        // Register a callback for the KeywordRecognizer and start recognizing!
        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        System.Action keywordAction;
        if (keywords.TryGetValue(args.text, out keywordAction))
        {
            keywordAction.Invoke();
        }
    }
}
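As with the tap gesture, these phrases only take effect if objects in the scene implement matching OnReset and OnDrop methods, analogous to the OnSelect handler sketched earlier: "Reset world" is broadcast to every descendant of the SpeechManager object, while "Drop Object" is sent only to the currently focused hologram.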
4. Audio (spatial sound)
Sample code:
using UnityEngine;

public class SphereSounds : MonoBehaviour
{
    AudioSource audioSource = null;
    AudioClip impactClip = null;
    AudioClip rollingClip = null;

    bool rolling = false;

    void Start()
    {
        // Add an AudioSource component and set up some defaults
        audioSource = gameObject.AddComponent<AudioSource>();
        audioSource.playOnAwake = false;
        audioSource.spatialize = true;
        audioSource.spatialBlend = 1.0f;
        audioSource.dopplerLevel = 0.0f;
        audioSource.rolloffMode = AudioRolloffMode.Custom;

        // Load the Sphere sounds from the Resources folder
        impactClip = Resources.Load<AudioClip>("Impact");
        rollingClip = Resources.Load<AudioClip>("Rolling");
    }

    // Occurs when this object starts colliding with another object
    void OnCollisionEnter(Collision collision)
    {
        // Play an impact sound if the sphere impacts strongly enough.
        if (collision.relativeVelocity.magnitude >= 0.1f)
        {
            audioSource.clip = impactClip;
            audioSource.Play();
        }
    }

    // Occurs each frame that this object continues to collide with another object
    void OnCollisionStay(Collision collision)
    {
        Rigidbody rigid = this.gameObject.GetComponent<Rigidbody>();

        // Play a rolling sound if the sphere is rolling fast enough.
        if (!rolling && rigid.velocity.magnitude >= 0.01f)
        {
            rolling = true;
            audioSource.clip = rollingClip;
            audioSource.Play();
        }
        // Stop the rolling sound if rolling slows down.
        else if (rolling && rigid.velocity.magnitude < 0.01f)
        {
            rolling = false;
            audioSource.Stop();
        }
    }

    // Occurs when this object stops colliding with another object
    void OnCollisionExit(Collision collision)
    {
        // Stop the rolling sound if the object falls off and stops colliding.
        if (rolling)
        {
            rolling = false;
            audioSource.Stop();
        }
    }
}
The OnCollisionEnter, OnCollisionStay, and OnCollisionExit events determine when to start playing an audio clip, whether to keep it playing, and when to stop it.
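Note that for audioSource.spatialize to actually produce spatialized output, the project's Spatializer Plugin (under Edit > Project Settings > Audio in Unity) needs to be set to the MS HRTF Spatializer, and the Impact and Rolling clips must exist in a Resources folder. Both points reflect the standard HoloLens/Unity setup of the time and should be checked against your own Unity version.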
Original content by AR醬; please credit the source when reposting.
WeChat ID: AR醬 (ARchan_TT)
AR醬 official site: www.arjiang.com