• conciselyverbose@sh.itjust.works
    9 months ago

    Today? Building stuff. The app ecosystem isn’t there for casual audiences yet, because devs need hands-on access to the hardware to build most of what it’s actually capable of. You can’t build much more than the basics using a phone to test.

    You don’t have to use an on-screen keyboard. It supports a Bluetooth mouse and keyboard perfectly fine. The bigger restriction to me is the number of windows, but there are ways to work around that.

    Interacting with and laying out information in 3D space is just different from doing it on a 2D display. Our brains understand 3D space intuitively in a pretty deep way. There was some real “OK, I turned my MacBook display in bed into a 500-foot screen next to a waterfall” factor, and stuff like the demo 3D videos of animals was super immersive. I did feel like I could reach out and touch them.

    But I want to build out the books in my personal collection into a 3D library with shelves to browse. I want to see stuff I’m modeling in actual 3D, instead of one 2D angle at a time that I have to manipulate to get different perspectives. I want to plan out a room layout by standing in the middle of the room and virtually dragging things around. I’m just spitballing a couple of the first things that come to mind, but ARKit is powerful and capable of all of that with relative ease. I can think of countless other “small” things that change the experience pretty significantly compared to doing stuff on a monitor.

    I don’t think it’s that different from people saying “you can do whatever on a computer” when iPhones or iPads came out. Sure, but as it gets into more hands and more people are able to build apps for it, people will come up with all kinds of uses that fundamentally feel different, even if the same core thing can be done on existing hardware.