2019年5月17日下午，在位于中关村的微软大厦内，微软创新车库（Microsoft Garage）携手微软混合现实事业部，联合小米、爱奇艺VR频道、DataMesh、VRplay等业内知名公司与社区，为160多位到场的XR开发者带来了一场精彩的以开放和分享为基础，共同展望 XR技术和趋势发展的分享交流会。Nikk和文松老师受邀参加分享会，演示了我们的产品，并与到场者交流业内心得。
On the afternoon of May 17th, 2019, Nikk Mitchell and Wilson Lee were invited to an XR mixer hosted by Microsoft Beijing and the Microsoft Garage project. The event drew guests from big players in the XR industry, such as Xiaomi, iQiyi VR, DataMesh, and VRplay, along with over 160 XR developers. Nikk and Wilson got to share some of FXG's latest projects and developments, while also learning about what other XR devs are working on.
首先由来自微软的三位嘉宾分别向到场的来宾致辞：微软亚太研发集团创新孵化和车库总监程骉博士，微软创新车库大中华区运营经理王世龙和微软混合现实产品市场经理郑桦。之后迎来了四组XR业界资深嘉宾：DataMesh CEO李劼、DataMesh CTO 邬浩； K-Labs新媒体试验场创始人、VRplay发起人潘博航；爱奇艺VR频道副主编韦骞（大铅笔）；小米VR内容运营负责人卢达晔（Nada），他们分别从各自探索的领域出发，与现场观众分享了他们的XR成长之旅。
The event kicked off with speeches from three Microsoft hosts: Dr. Cheng Biao, Director of Innovation Incubation and the Microsoft Garage at Microsoft's Asia-Pacific R&D Group; Wang Shilong, Operations Manager of the Microsoft Garage for Greater China; and Zheng Hua, Product Marketing Manager for Microsoft Mixed Reality. They were followed by four groups of XR industry veterans: DataMesh CEO Li Jie and CTO Wu Hao; Pan Bohang, founder of the K-Labs new-media lab and initiator of VRplay; Wei Qian, deputy editor of iQiyi's VR channel; and Lu Daye (Nada), Xiaomi's head of VR content operations. Each shared their XR journey with the audience from their own field of exploration.
During the open discussion session, FXG's interactive holographic experience drew lots of attention, and FXG's CTO Wilson Lee had a lengthy technical exchange with the other guests, trading ideas, suggestions, and answers to each other's questions.
现场互动中，Nikk体验了全新的Azure Kinect。Azure Kinect，是Kinect for Windows的继任者，具有同类最佳的深度传感器、高清4K摄像头和麦克风阵列（7个麦克风），它不仅能看和能听，还能理解人，理解环境、物体和动作。我们听听Nikk的亲测。
The event had a live demo of Microsoft's new Azure Kinect, the latest revision of the Kinect camera and successor to Kinect for Windows. The new device features a best-in-class depth sensor, a high-definition 4K camera, and a seven-microphone array. That means it can not only see and hear, but also understand people, environments, objects, and movements. Here are Nikk's impressions of the new Kinect:
“It is very clearly a HUGE improvement over Kinect 2. The depth data is SO much better. As you can see from the video it was getting my full-face including lips and tongue very clearly, and even my hands. Minimum clipping distance is also improved to I'm guessing around 30ish CM. Body tracking was also a lot better, finally allowing turning. This was with only 1, and I was told Kinect Azures sync up very nicely. Can't wait to try 3 of them!”
Nikk shared his Azure Kinect experience in Facebook's largest volumetric video community. It got lots of interest from enthusiasts and professionals alike, and they were excited to see more of FXG's work.
Huge thanks to Microsoft for inviting us to this event. We look forward to integrating this new technology into our solutions and creating exciting new experiences. Are you interested in the Azure Kinect? Do you have any questions about it? Leave a comment and maybe Nikk can help answer your questions!
Follow us @
官网 Website: fxg.space