• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: July 3rd, 2023

  • I am not convinced bots would fill the list with hypothetical purchases; I don’t think scalpers are interested in waiting or in having money tied up in backorders.

    The point is to eliminate the scalper advantage by ensuring one can buy the product “at some point”. If you need it by Christmas or whatever, then you are kind of screwed.

    I remember that for the Steam Deck OLED, stock was released in waves over at least a month, so even though the first batch sold out in minutes, there was no rush to refresh the store page to try and finish the transaction before it ran out. This is in direct contrast to (say) the PS5, which sold out in minutes and still wasn’t available anywhere over a year after it launched.

    I don’t really understand how Valve solved the problem; it should have followed the same pattern of being sold out in minutes, leaving scalpers as the only option for months, but interestingly that’s not what happened.


  • It seems everyone forgot that not too long ago, you could back-order something and the store would call/ship whenever they got it.

    Just let me give you my credit card number in exchange for a spot on the waiting list; then the scalpers lose and I get my new launch thing whenever they get around to it. But no, that would be too simple, gotta get the crowd riled up to race for the available units!

    I suppose this could be abused like everything else, but it wouldn’t be worse than what we have now, with fucking scalpers using automated bots to buy up the little stock that trickles in.

    It’s not just about getting your fix of the new shiny sooner; sometimes you really need a new GPU to replace the one you’ve had for 5 years! Why should you settle for the previous generation if the new one just came out and you are willing to pay launch MSRP for that privilege (not 2-3x MSRP to scalpers!!)?




  • fulg@lemmy.world to Technology@lemmy.world · What Ever Happened to Netscape? · 16 days ago

    They became a poster child for why you should never “start over from scratch”, even if your current codebase is awful. Because when you do that, your competitors keep going, and then they have years on your now-stale product. Netscape lost it all on their own…

    Also: selling a browser? Man, the ’90s were wild.



  • Vulkan and DirectX could already share shaders, because the source language for both can already be HLSL. The difference is the intermediate representation the shaders are compiled to, and that will now be the same going forward (SPIR-V for both).

    The real winners here are driver programmers at NVIDIA/AMD/Intel, since they will no longer have to develop support for both DXIL and SPIR-V (which are similar in concept but different in implementation). How much of that will be true in practice remains to be seen, but I am hopeful.

    There are already tools to analyze, process and transform SPIR-V bytecode; presumably those will work for DX12 Shader Model 7 too. That might make performance analysis easier, and the same goes for debugging via a tool like RenderDoc, which supports SPIR-V but not DXIL.

    As for the overhead of DirectX: with DX12 that is largely no longer true. Both are high-performance APIs with comparable overhead (i.e. as little as possible).
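
    For illustration, here is a minimal sketch of how one HLSL file can already be compiled to both IRs with the DirectX Shader Compiler (dxc), assuming dxc is on your PATH; the file names and shader profile are just placeholders:

        import subprocess

        # Hypothetical inputs, just for illustration.
        HLSL = "lighting.hlsl"   # same HLSL source for both back ends
        ENTRY = "main"           # entry point function in the shader
        PROFILE = "ps_6_0"       # pixel shader, Shader Model 6.0

        # Default dxc output is DXIL, the current D3D12 intermediate representation.
        subprocess.run(
            ["dxc", "-T", PROFILE, "-E", ENTRY, "-Fo", "lighting.dxil", HLSL],
            check=True,
        )

        # The -spirv switch asks dxc to emit SPIR-V (Vulkan) from the same source.
        subprocess.run(
            ["dxc", "-spirv", "-T", PROFILE, "-E", ENTRY, "-Fo", "lighting.spv", HLSL],
            check=True,
        )

    Only the back-end IR differs between the two runs, which is exactly the duplication a single SPIR-V target would remove for the driver teams.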



  • I remember your previous post, congrats on not giving up.

    Whipping up a script to solve a very specific problem is super satisfying, but I found that anything you write quickly becomes a liability. Debugging Perl can be super difficult, especially when returning to something you wrote a while back.

    Personally I grew tired of the punishment and left it all behind! If I need a quick script I’ll use Python instead, and if it doesn’t work I can use a real debugger to fix it.
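
    For example, a quick throwaway script like this hypothetical one (the file format and names are made up) can drop straight into Python’s standard pdb debugger when it hits data it doesn’t understand:

        import sys

        def total_first_column(path):
            """Sum the first column of a whitespace-separated file (toy example)."""
            total = 0
            with open(path) as f:
                for line in f:
                    fields = line.split()
                    if not fields:
                        continue
                    try:
                        total += int(fields[0])
                    except ValueError:
                        # Drop into the interactive debugger right where the bad
                        # data is, instead of sprinkling print statements around.
                        breakpoint()
            return total

        if __name__ == "__main__":
            print(total_first_column(sys.argv[1]))

    Running the whole thing under the debugger with “python -m pdb script.py data.txt” works too, if you’d rather step through from the start.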

    In any case it’s always fun learning new things. I hope this experience ends up being useful to you in the future, and that you get to easily solve a problem that stumps everyone else involved.

    Cheers!




  • I should have prefaced that I did not actually run this myself, but I did take note of it because it looked promising. Sorry for the false hope!

    I would expect it to work after a lot of fussing about, and then break at the slightest update. Easier to run it in a VM (which is itself not easy if you want GPU acceleration without dedicating a card to it - I never managed to get Intel GVT-g or GVT-d to work reliably).


  • fulg@lemmy.world to 3DPrinting@lemmy.world · *Permanently Deleted* · 4 months ago

    It looks like Fusion 360 runs fine on Linux these days; I don’t know how reliable that is in practice (I would expect not very).

    OnShape is a great option if the licensing terms are compatible with what you are doing. They used to have licensing terms similar to Fusion 360’s, where you could still get paid for your work while using the free version (e.g. via YouTube), but they changed the terms to remove this loophole. Fusion still allows it with the Startup license, but of course they could change their mind at any time, and then you’d be out of luck.

    I dislike the lockdown of Fusion 360, but its mental model works with my own (I can’t “get” SolidWorks and never remember how to do anything in it). Speaking of SolidWorks, they added a reasonably priced license for DIY/hobbyists, but it has the same lockdown as Fusion 360 and is still Windows-only.

    I’m in the same boat as you: just a hobbyist doing this for my own use, with no interest in becoming an industrial engineer. For now I will keep using Fusion 360, and when that stops being an option I’ll move on to something else. I can whip out models for my prints easily enough, and the 10-document limit is just an annoyance, not a real limitation.

    At the very least, whatever you design in Fusion 360 or OnShape won’t be stuck in there; you can export it via .step files. You lose the design history (if applicable) but not the model itself.


  • To be fair, USB-C didn’t exist when Lightning was introduced, and Lightning was vastly superior to Micro-USB.

    It doesn’t really have any reason to exist now…

    Agreed with your other points though!

    I have an old iPad that I’m trying to reuse for another purpose, and all the locks stopping me from keeping it in use make it such a pain in the butt, when the alternative is simply enabling developer mode on an Android tablet.

    Thankfully I remembered that when buying a laptop and skipped the very enticing M-series hardware, because in 5-7 years that thing will be a brick destined for the landfill.