Nobody's business but the user's what wallpaper they like.
I use images from the UK canal infrastructure, where I spend much of my time.
If you're going to tell me to do otherwise, my response is going to be short and rude.
Darktable
That's easy.
Some folks are insane.
And thank fuck for them. Doing dumb stuff like this has led to so much of the useful stuff we see and use now.
“Can I make this dumb idea work” is the very source of inspiration behind science. Never underestimate its value.
My relative sanity can be so disappointing to me ;)
Heavy Blender users tend to avoid AMD for the reasons you point out.
This leads to fewer updates, since AMD users aren't that interested in the community.
It is an issue without any practical solution, because as I need a long overdue upgrade, Nvidia again seems the only real choice.
Everyone is sorta forced to do that unless we can convince AMD users to just try out Blender and submit results.
So hi, any AMD users who don't care about Blender:
give it a try and submit performance data, please.
Yeah, it looks very much like Nvidia is exclusive at the top, even at the price I'm looking at.
The RTX 4060 looks about right on price vs performance. I'll spend some time looking up how well they play with Linux atm, and keep an eye out for a used RTX 4070 as well.
If no one minds my hijacking part of this thread,
I'd also like some similar advice.
I use Blender. Not heavily, but I have been playing with it for 20-plus years.
My GPU is pretty old: a 1050 Ti, from when Nvidia was pretty much it for Blender.
I'm looking for a sub-£300 card in the next 3 to 6 months.
Is AMD well supported by Blender now? And what cards would folks recommend these days?
PS: not a gamer. 0ad is about as close as I get.
A valid point. But the result is that, over a pretty short period of time, these C developers will find delays in how quickly their code gets accepted into stable branches etc. So they will be forced to clearly document how the refactoring affects other elements calling the code, or move on altogether.
Sorta advantageous to all, and a necessary way to proceed when others are using your code.
Because some users are putting that data on Linux. So they want Linux to be killed.
They can't change GRUB. But they sure as hell can convince micro$oft to search for and nuke it.
Of course, no idea if this happened. Just answering why they might want to.
Cool. At the time, it was one of the best. Although I also liked SunOS.
I also worked with VMS a lot after uni. Hated using it, but had to respect the ideals behind it.
But watching the growth of Linux has been fantastic. In 2024, it does seem to have out-evolved all the others. (Evolved, defined as having developed the ability to survive by becoming so freaking useful.)
I am starting to think it is time for a microkernel version, though.
Was a few years later for me.
Not DMU by any chance?
In the late 1990s my uni had Unix workstations (HP-UX).
So all projects etc. were expected to be done on those. Linux at the time was the easy way to do it from home.
By the time I left uni in '98, I was so used to it that Windows was a pain in the butt.
For most of the time since, I have been almost 100% Linux, with just a dual boot to sort some hardware/firmware crap.
Ham radio to this day: many products can only do updates with Windows.
Just off the top of my head, discovered today.
Not a GUI, as one exists, but a more configurable one, as the current one is crap for the visually impaired.
The rpi-imager GUI does not take theme hints for font size etc. Worse, it has no configuration to change such things,
making it pretty much unusable for anyone with poor vision.
Also, it varies for each visually impaired individual, but dark mode is essential for some of us.
So if you're looking for small projects, you'd at least make me happy ;)
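For illustration only, here's a minimal, hypothetical Qt sketch of the kind of behaviour I mean (rpi-imager is Qt-based, but the names and settings key below are made up, not its real code): start from the platform theme's font instead of hard-coding one, and let the user override the size.

```cpp
// Hypothetical sketch: honour the system font, allow a user override.
#include <QGuiApplication>
#include <QFont>
#include <QSettings>

int main(int argc, char *argv[]) {
    QGuiApplication app(argc, argv);

    // Start from the platform theme's default font, which already
    // carries the user's desktop-wide accessibility font choice.
    QFont font = QGuiApplication::font();

    // Hypothetical per-app override so low-vision users can scale up
    // even when the desktop theme isn't passed through.
    QSettings settings("example-org", "imager-sketch");
    int pointSize = settings.value("fontPointSize", -1).toInt();
    if (pointSize > 0)
        font.setPointSize(pointSize);

    QGuiApplication::setFont(font);
    // ... load the QML UI and run app.exec() here ...
    return 0;
}
```

Dark mode is the same story: respect the theme first, then offer an explicit toggle for those of us who need it.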
Yep pretty much but on a larger scale.
1st, please do not believe the bull that there was no problem. Many folks like me were paid to fix it before it was an issue. So other than a few companies, few saw the result, not because it did not exist, but because we were warned. People make jokes about the over-panic. But if that had not happened, it would have taken years to fix, not days, because without the panic most corporations would have ignored it. Honestly, the panic scared shareholders, so boards of directors had to get experts to confirm the systems were compliant. And so much dependent crap was found running it was insane.
But the exaggerations about planes falling out of the sky etc. were also bull. Most systems would have failed, but BSODs would be rare; code would crash, some would keep working with errors or shut down cleanly, and some failures would go undiscovered until a short while later, as accounting or other errors showed up.
As others have said, the issue was that since the 1960s computers had been set up to treat years as two digits, so they had no way to handle 2000 other than to assume it was 1900. While from the early 90s most systems were built with ways to adapt to it, not all were, as many teams were only developing top-layer stuff, and many libraries etc. had not been checked for this issue. Huge amounts of the world's IT infrastructure ran on legacy systems, especially in the financial sector where I worked at the time.
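For anyone who never saw this class of bug, here's a minimal, hypothetical C++ sketch (made up for illustration, not from any system I worked on) of both the bug and the "windowing" workaround that was a common quick fix:

```cpp
// Hypothetical illustration of the classic two-digit-year bug.
#include <cstdio>
#include <ctime>

int main() {
    std::time_t now = std::time(nullptr);
    std::tm *t = std::localtime(&now);

    // The bug: tm_year is years *since 1900*, but old code often
    // treated it as a two-digit year. Fine through 1999 ("1998"),
    // but on 2000-01-01 tm_year is 100, so this prints "19100".
    std::printf("Buggy:    19%d\n", t->tm_year);

    // A common Y2K quick fix was "windowing": pick a pivot so that
    // 00-69 mean 2000-2069 and 70-99 mean 1970-1999.
    int twoDigit = t->tm_year % 100;
    int windowed = (twoDigit < 70) ? 2000 + twoDigit : 1900 + twoDigit;
    std::printf("Windowed: %d\n", windowed);

    // The real fix: store and print the full four-digit year.
    std::printf("Correct:  %d\n", t->tm_year + 1900);
    return 0;
}
```

Windowing just moved the cliff, of course, which is why so much of the work was hunting down every place a date was stored in two digits.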
The internet was a fairly new thing, so often stuff had been running for decades with no one needing to change it, or having any real knowledge of how it was coded. So folks like me were forced to hunt through code, or often replace systems that were badly documented or, more often, not documented at all.
A lot of modern software development practices grew out of discovering what a fucking mess can grow if people accept an “if it ain’t broke, don’t touch it” mentality.
Very much so. But the vulnerabilities do not tend to be discovered (by developers) until an attack happens, and auto-updates are generally how the spread of attacks is limited.
Open source can help slightly, due to both good and bad actors unrelated to development seeing the code, so it is more common for alerts to land before attacks. But it is far from a fix-all.
But generally, the time between discovery and fix is a worry for big corps, which is why auto-updates have been accepted with less manual intervention than was common in the past.
Not OP, but that is how it used to be done. The issue is that the attacks we have seen over the years, i.e. ransomware attacks etc., have made corps feel they need to patch and update instantly to avoid attacks. So they depend on the corp they pay for the software to test the rollout.
Auto-update is a two-edged sword. Without it, attackers will take advantage of delays. With it, well: today.
If they just called it "Other",
it would gain a huge boost in desktop usage figures.
Thanks. That was exactly what I needed. I’ll look it up.
Blasphemy! Quick, stone the unbelievers.
Kidding, of course. Have to admit I agree. I've used Linux since the late 1990s, so long, long before it was usable by most folks' standards.
I started because my university had HP-UX machines that we needed to submit work on, so I wanted a Unix-like environment at home I could work on. This was a time when Linux was basically Slackware on 50-plus floppy disks, and X Windows needed configuring for every monitor. Honestly, by current standards usability was non-existent compared to Windows.
But honestly, I spent so much time on the system, and watched it improve, to the point that I find Windows an utter pain in the arse now, and will avoid it under all circumstances.
But the idea of convincing folks who have no interest... where the hell do folks find the time?
Honestly, what we have now is AI, as in it is not intelligent, it just tries to mimic it.
Digital Intelligence, if we ever achieve it, would be a more accurate name.
Well now I suddenly care.
Why the hell do you want to watch the world burn?
;)