Yelp still exists!?
Who is finding a McDonald's location on their phone's map app and then thinking "I'd better cross-check this against Yelp first"!?
1440p for the win!
Single GPU, with scripts that run before and after the VM is active to unload (and then reload) the GPU driver modules from the kernel.
I think this was my starting point, and I had to do just a few small tweaks to get it right for my setup - i.e. unload and reload the precise set of kernel modules that block GPU passthrough on my machine.
https://gitlab.com/Karuri/vfio
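For anyone wondering what those before/after scripts amount to, here's a minimal sketch of the idea as a libvirt QEMU hook, written in Python purely for illustration - the module list, display-manager unit and VM name are all assumptions (NVIDIA shown; check lsmod for what actually holds your GPU), and a real setup needs more error handling:

```python
#!/usr/bin/env python3
# Minimal sketch of a libvirt QEMU hook for single-GPU passthrough.
# Module names, the display-manager unit and the VM name are assumptions.
import subprocess
import sys

GPU_MODULES = ["nvidia_drm", "nvidia_modeset", "nvidia_uvm", "nvidia"]

def run(*cmd):
    subprocess.run(cmd, check=True)

def unload_gpu():
    run("systemctl", "stop", "display-manager")   # free the GPU from the desktop
    for mod in GPU_MODULES:
        run("modprobe", "-r", mod)                # drop the driver modules
    run("modprobe", "vfio-pci")                   # let VFIO claim the card

def reload_gpu():
    run("modprobe", "-r", "vfio-pci")
    for mod in reversed(GPU_MODULES):
        run("modprobe", mod)
    run("systemctl", "start", "display-manager")  # bring the desktop back

if __name__ == "__main__":
    # libvirt invokes hooks as: <vm-name> <operation> <sub-operation> ...
    vm, op = sys.argv[1], sys.argv[2]
    if vm == "win10-gaming":                      # hypothetical VM name
        if op == "prepare":
            unload_gpu()
        elif op == "release":
            reload_gpu()
```

Installed as /etc/libvirt/hooks/qemu and made executable, libvirt runs it automatically around VM start and stop, which is what makes the whole thing hands-off.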
At this point, from a user-experience point of view, it's not much different to dual booting, just with a different boot sequence. The main advantage, though, is that I can have the Windows OS on a small virtual hard drive for ease of backup/clone/restore, and have game installs on a dedicated NVMe that doesn't need backing up.
I've been 100% Linux for my daily home computing for over a year now... With one exception... To be honest, I didn't even try particularly hard to make gaming work under Linux.
Instead I have a Windows VM - set up with full passthrough access to my GPU and its own NVMe - just for Windows gaming. To my mind it's now in the same category as running console emulation.
As soon as I click shut down in Windows, it pops me straight back into my Linux desktop.
This video of one of the rioters getting repeatedly struck with bricks thrown by his own mates is well worth a watch... Or two... Or three...
If Newey ends up at Ferrari in time to design their 2026 car, Hamilton has either lucked into a second stunning career move or he's known all along...
I guess my point is that it isn't a particularly important part of the design of Wi-Fi - they included it in the very first iteration in 1997 and realised by 1999 they didn't need it. Therefore Wi-Fi would likely have been born regardless of the invention; Bluetooth would not.
Great to recognise this invention.
I was surprised by the choice of 'Mother of Wi-Fi' though - Wi-Fi hasn't used 'frequency hopping' as such since 802.11b was released back in 1999 - so very few people will have ever used frequency-hopping Wi-Fi.
GPS only uses it in some extreme cases I think, but I'm not an expert.
However, Bluetooth absolutely does depend on it to function in most situations, so 'Mother of Bluetooth' might have been more appropriate.
The real question is why did they install a system based on 5.25" floppy disks in 1998 in the first place!?
The 5.25" floppy was surpassed by the 3.5" floppy by 1988 - ten years prior to this systems installation - and by 1998 most new software was being distributed on CD-ROM. So by my reckoning, in 1998 they installed a 'new' system based on hardware that was 1.5 generations out-of-date and haven't updated it in the 26 years since.
I've found all of the tabs on Google have a tendency to go AWOL these days. The other day, for example, I was searching for camera lenses and Google took away the 'Products' (formerly known as 'Shopping') tab, even though what I was searching for couldn't have been more obviously a product. Instead, all I could get were super-low-quality copy-paste blogs vaguely related to the product.
I agree it's good that the article is not hyping up the idea that the world will now definitely be saved by fusion and that we can all therefore go on consuming all the energy we want.
There are still some sloppy things about the article that disappoint me though...
They seem to be implying that 500 TW is obviously much larger than 2.1 MJ... but watts measure power while joules measure energy, so without knowing how long the 500 TW is required for, the comparison is meaningless.
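To put numbers on it (assuming the 500 TW figure is the peak laser power and the full 2.1 MJ is delivered at that power), dividing energy by power gives the implied pulse length:

$$ t = \frac{E}{P} = \frac{2.1 \times 10^{6}\ \mathrm{J}}{5 \times 10^{14}\ \mathrm{W}} \approx 4.2\ \mathrm{ns} $$

So the two figures together really only tell you the pulse lasted a few nanoseconds.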
They imply that using more power than is available from the grid is infeasible, but it evidently isn't, as they've done it multiple times - presumably by charging up local energy storage and releasing it quickly. Scaling this up is obviously a challenge though.
The weird mix of metric prefixes (mega) and standard numbers (trillions) in a single sentence is a bit triggering - that might just be me though.
Based on how you're observing the load move from 100% CPU to 100% GPU, I would suggest that it is "working" to some extent.
I don't have any experience with that GPU, but here are a few things to keep in mind:
When you use a GPU for video encoding, it's not the case that it's 'accelerating' what you were doing without it. What you're doing is switching from running a software implementation of an HEVC encoder on your CPU to running a hardware implementation of an HEVC encoder on your GPU. Hardware and software encoders are very different to one another, and they won't combine forces; it's one or the other.
Video encoders have literally hundreds of configuration options, and how you configure the encoder will have a massive impact on the encoding time. Getting results that I'm happy with for archiving usually means encoding at slower than real-time on my 5800X CPU; if you're getting over 100fps on your CPU, I would guess you have it set up on some very fast settings - I wouldn't recommend that for anything other than real-time transcoding. Conversely, it's possible you have slower settings configured for your GPU.
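To make the software-vs-hardware distinction concrete, here's a sketch of the same source encoded both ways - the file names are made up, it assumes an ffmpeg build with libx265 and NVENC support, and the p1-p7 NVENC presets need a reasonably recent ffmpeg:

```python
# Sketch: encode one source with a software and a hardware HEVC encoder.
# Assumes ffmpeg is on PATH; file names are hypothetical.
import subprocess

SRC = "input.mkv"  # hypothetical source file

# Software HEVC (libx265) on the CPU. The preset trades speed for compression
# efficiency: "slow" or slower is typical for archiving, "ultrafast" for speed.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libx265", "-preset", "slow", "-crf", "20",
     "cpu_encode.mkv"],
    check=True,
)

# Hardware HEVC (NVENC) on the GPU. "p7" is the slowest/highest-quality NVENC
# preset; even so, expect a bigger file than libx265 at comparable quality.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "hevc_nvenc", "-preset", "p7", "-cq", "20",
     "gpu_encode.mkv"],
    check=True,
)
```

Comparing the two outputs' file sizes at a similar perceived quality is a quick way to see the trade-off described below for yourself.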
Video encoding is very difficult to do "well" in hardware. Generally speaking, software is better suited to the sort of algorithms that are needed. GPUs can be beneficial in speeding up an encode, but the result won't be as good in terms of quality vs file size - for the same quality a GPU encode will be bigger, or for the same file size a GPU encode will be lower quality.
I guess this is a roundabout way of suggesting that if you're happy with the quality of your 100fps CPU encodes, stick with it!