


posted by hubie on Saturday September 14, @10:07PM   Printer-friendly
from the what-about-for-the-port-that-runs-on-my-refrigerator? dept.

Arthur T Knackerbracket has processed the following story:

The first-person shooter Doom has so many ports on so many different consoles and computers that modders have had to find new places to port the game, like autonomous lawnmowers, digestive bacteria, and even Doom II itself.

One port that’s not nearly as popular or playable as the others is the Sega Saturn port that came out nearly four years after the game’s release. GameSpot’s Jeff Gerstmann called the Sega Saturn Doom port just about everything you can call a bad game without straying over the boundaries of good taste: “completely worthless,” “drab,” “jerky,” “to be avoided at all costs.”

Bo, a self-described reverse engineer of Sega Saturn games, gave the Sega Saturn port of Doom another chance, and he discovered a cheat code in the game that had been lying dormant for decades. He posted the secret cheat code he found on X.

The button combination X, Right, B, Y, X, Right, B, Y gives you the ability to see through the walls of the Mars substation and even Hell. It’s too bad the game doesn’t have a cheat code that lets you see a better version of Doom.
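For the curious, sequence cheats like this are typically detected with a rolling buffer of the most recent button presses compared against a known sequence. A minimal Python sketch of the idea (hypothetical names and structure, not the Saturn port's actual code):

```python
from collections import deque

# Hypothetical illustration -- not the Saturn port's actual implementation.
CHEAT_XRAY = ["X", "Right", "B", "Y", "X", "Right", "B", "Y"]

class CheatDetector:
    """Keep a rolling buffer of the last N button presses and fire when it
    matches a known cheat sequence exactly."""
    def __init__(self, sequence):
        self.sequence = list(sequence)
        self.buffer = deque(maxlen=len(sequence))

    def press(self, button):
        self.buffer.append(button)
        return list(self.buffer) == self.sequence

detector = CheatDetector(CHEAT_XRAY)
fired = [detector.press(b) for b in ["A"] + CHEAT_XRAY]
# Only the final press, completing the full sequence, triggers the cheat.
```

The `deque(maxlen=…)` automatically discards the oldest press, so stray inputs before the sequence (the leading "A" above) do no harm.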


Original Submission

posted by hubie on Saturday September 14, @05:23PM   Printer-friendly
from the down-down-down-and-the-flames-went-higher dept.

https://arstechnica.com/gadgets/2024/09/music-industrys-1990s-hard-drives-like-all-hdds-are-dying/

One of the things enterprise storage and destruction company Iron Mountain does is handle the archiving of the media industry's vaults. What it has been seeing lately should be a wake-up call: roughly one-fifth of the 1990s-era hard disk drives it has been sent are entirely unreadable.

Music industry publication Mix spoke with the people in charge of backing up the entertainment industry. The resulting tale is part explainer on how music is so complicated to archive now, part warning about everyone's data stored on spinning disks.

"In our line of work, if we discover an inherent problem with a format, it makes sense to let everybody know," Robert Koszela, global director for studio growth and strategic initiatives at Iron Mountain, told Mix. "It may sound like a sales pitch, but it's not; it's a call for action."
[...]
Mix's passing along of Iron Mountain's warning hit Hacker News earlier this week, which spurred other tales of faith in the wrong formats. The gist of it: You cannot trust any medium, so you copy important things over and over, into fresh storage. "Optical media rots, magnetic media rots and loses magnetic charge, bearings seize, flash storage loses charge, etc.," writes user abracadaniel. "Entropy wins, sometimes much faster than you'd expect."
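The standard defense behind "copy important things over and over" is to record a checksum when data is first archived and verify it at every copy, so silent rot is caught while a good copy still exists. A minimal sketch using Python's hashlib (file contents inlined as bytes for illustration):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest, recorded alongside the archived blob."""
    return hashlib.sha256(data).hexdigest()

def verify_copy(recorded_digest: str, data: bytes) -> bool:
    """Re-hash a freshly read copy and compare with the recorded digest,
    so silent corruption is caught before the old medium is retired."""
    return sha256_of(data) == recorded_digest

master = b"session tapes, 1994"          # stand-in for real file contents
digest = sha256_of(master)

clean = verify_copy(digest, master)                    # True
corrupt = verify_copy(digest, b"sessXon tapes, 1994")  # False: one bad byte
```

Without the recorded digest, a bit-rotted copy looks just as valid as a good one, which is how archives quietly decay.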

There is discussion of how SSDs are not archival at all;
[...]
Knowing that hard drives will eventually fail is nothing new. Ars wrote about the five stages of hard drive death, including denial, back in 2005.
[...]
Google's server drive data showed in 2007 that HDD failure was mostly unpredictable, and that temperatures were not really the deciding factor.


Original Submission

posted by janrinok on Saturday September 14, @12:39PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

DNA analysis shows that people from Easter Island had contact with Indigenous Americans around the 1300s, and finds there was no population crash before the arrival of Europeans

DNA analysis of ancient remains from Easter Island shows that the population was in fact increasing when Europeans arrived, rather than collapsing as reported by some historical accounts.

The results also show that there were interactions between the residents of the island and those of South America long before the arrival of Europeans. Both the island and its people are also known as Rapa Nui.

Located in the Pacific Ocean 3500 kilometres from South America, Rapa Nui is one of the most remote inhabited islands on Earth. Polynesian people began settling there around AD 1200, when its 164 square kilometres were covered in palm forests.

By the time Europeans arrived in 1722, the vegetation had been largely destroyed by a combination of rats and overharvesting. The history of the island has often been portrayed as an example of unsustainable ecological exploitation and population growth followed by collapse.

In the latest study, J. Víctor Moreno-Mayar at the University of Copenhagen, Denmark, and his colleagues looked at 15 sets of human remains kept at the National Museum of Natural History in Paris, France, collected by expeditions in 1877 and 1935.

The researchers worked closely with representatives of the Rapa Nui community. One of their aims was to confirm that the individuals at the museum were, in fact, from the island, as there is an effort being led by modern residents to repatriate the remains.

The results show that the 15 people, who all died over the past 500 years, did originate on Rapa Nui.

A population undergoing a bottleneck from a collapse in numbers will have signals in its DNA showing a drop in genetic diversity, says Moreno-Mayar.

“We are using statistical methods that can reconstruct the genetic diversity in the Rapa Nui population throughout the last few thousand years,” he says. “And interestingly enough, we do not find any evidence of a dramatic population decline around 1600s as expected from the collapse theory.”
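The bottleneck signal being tested for can be illustrated with the classic genetic drift relation: expected heterozygosity shrinks each generation by a factor of (1 - 1/(2N)), where N is the effective population size. A toy calculation with invented numbers (not the team's actual statistical method):

```python
def heterozygosity(h0, pop_sizes):
    """Expected heterozygosity after genetic drift: each generation with
    effective population size N multiplies H by (1 - 1/(2N))."""
    h = h0
    for n in pop_sizes:
        h *= 1 - 1 / (2 * n)
    return h

# Invented numbers: a stable population vs. one that crashes for 10 generations.
stable = heterozygosity(0.5, [2000] * 40)
crashed = heterozygosity(0.5, [2000] * 15 + [100] * 10 + [2000] * 15)
# The crashed population ends with visibly less diversity -- the kind of
# signature the study did not find in the Rapa Nui genomes.
```

Even a brief crash leaves a lasting dent in diversity, which is why its absence in the ancient genomes argues against the collapse theory.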

Instead, the results suggest that the Rapa Nui population increased steadily until the 1860s, when slave traders kidnapped hundreds of islanders and a smallpox outbreak killed many more.

The study also identified stretches of DNA in the ancient Rapa Nui genomes that have an Indigenous American origin. Their analysis suggests that the mixing of these populations occurred around the 1300s.

“Our interpretation is that the ancestors of Rapa Nui first peopled the island and shortly after made a return journey to the Americas,” says Moreno-Mayar.

Previous studies have also cast doubt on the story of a population collapse. Carl Lipo at Binghamton University in New York says it was “terrific” to learn that a completely independent line of evidence points to the same conclusions his team reached in a paper published earlier this year, using radiocarbon and archaeological evidence.

He says the study confirms that the island was populated with people who lived resiliently and successfully until the arrival of Europeans.

Journal Reference: Nature, DOI: 10.1038/s41586-024-07881-4


Original Submission

posted by janrinok on Saturday September 14, @07:54AM   Printer-friendly
from the Manchester-England-not-Manchester-Michigan dept.

Several sites have covered the dynamic pricing scandal concerning Ticketmaster's sales of tickets to the reunion tour of Oasis, the Manchester-based English rock band. Aside from the problems of the monopoly maintained by Ticketmaster, and aside from the problem of ticket scalping which is encouraged by Ticketmaster's business model, the dynamic pricing has come across as price gouging and a possible breach of consumer law. The Competition and Markets Authority is now launching an investigation into whether, and to what extent, Ticketmaster engaged in unfair, prohibited commercial practices.

Some fans paid more than £350 for tickets with a face value of less than £150, and had to make a split-second decision whether to complete their purchase, as dynamic pricing caused prices to soar during the booking process.
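Ticketmaster has not published its algorithm, but dynamic pricing of this kind generally scales a face value by a live demand signal, clamped to some multiplier cap. A purely hypothetical toy model:

```python
def dynamic_price(face_value, demand_ratio, cap=3.0):
    """Scale a ticket's face value by a demand signal (e.g. queue length vs.
    remaining inventory), clamped between 1x and a multiplier cap.
    Purely illustrative -- not Ticketmaster's actual algorithm."""
    multiplier = min(max(demand_ratio, 1.0), cap)
    return round(face_value * multiplier, 2)

# A ticket with a face value just under £150 when demand runs at 2.4x supply:
surge = dynamic_price(148.50, 2.4)   # jumps past £350 mid-checkout
```

Because the demand signal is recomputed while fans sit in the queue, the price a buyer sees at checkout can differ sharply from the one advertised when they joined it.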

Lisa Webb, a consumer law expert at Which?, said: "It seems extremely unfair that Oasis fans got up early and battled through queues only to find that ticket prices had more than doubled from the originally advertised price.

"Oasis and Ticketmaster should do the right thing and refund fans who may have been misled into paying over the odds for tickets that would have been half the price just hours earlier."

Oasis and Ticketmaster urged to refund fans after 'dynamic pricing' debacle, The Guardian.

Where have Soylentils been seeing dynamic pricing lately?

This has forced the band to issue a press release distancing itself from Ticketmaster and its practices.

The band released a statement on Wednesday evening denying they were behind the dynamic pricing.

"It needs to be made clear that Oasis leave decisions on ticketing and pricing entirely to their promoters and management, and at no time had any awareness that dynamic pricing was going to be used," said the statement.

It said that "meetings between promoters, Ticketmaster and the band's management" had resulted in an agreement to use dynamic pricing "to help keep general ticket prices down as well as reduce touting".

However, "the execution of the plan failed to meet expectations".

UK launches investigation into Ticketmaster's pricing for Oasis reunion tour, France24.

Also at:

U.K. is investigating Ticketmaster after Oasis tour prices surprised fans, NPR.
Oasis fans' fury over 'in demand' tickets as prices rocket - as Ticketmaster issues statement, Manchester Evening News.
Ticketmaster's Dynamic Pricing Faces U.K. Government Investigation After Oasis Reunion Tour Sale, Rolling Stone.
Oasis ticketing chaos prompts probe into dynamic pricing, The Verge.
Hundreds lodge complaints over Oasis ticket prices, BBC.
Oasis Says Ticketmaster & Management Are Responsible for High Ticket Prices, Digital Music News.
Ticketmaster Officially Faces CMA Investigation Over Oasis Ticket Prices — As the Regulator Also Scrutinizes Dynamic Pricing's 'Broader Competition and Consumer Issues', Digital Music News.
Oasis Fans Complain Of Ticketmaster Errors, Long Waits & Price Surges For Reunion Tour, Deadline.
Oasis Fans Face Crashes, Bots and Dynamic Pricing as Reunion Tickets Go On Sale, Rolling Stone.
Oasis fans react to Ticketmaster's dynamic pricing, NME.
What is dynamic pricing and how does it work?, Irish Examiner.

Previously:
(2024) We're Entering an AI Price-Fixing Dystopia
(2018) Ticketmaster Plans to Roll Out Facial Recognition. What Could Go Wrong?
(2016) Surge Pricing Arrives in Disney's Magic Kingdom
(2015) How Amazon Tricks you into Thinking it Always has the Lowest Prices


Original Submission

posted by janrinok on Saturday September 14, @03:05AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

The RX 7800M is a powerful mobile GPU for gaming notebooks.

AMD has officially debuted its sixth discrete GPU in its mobile RX 7000 lineup, the RX 7800M. The new GPU is AMD's second mobile RDNA 3 GPU to arrive with a chiplet-style architecture and is the runner-up to the flagship RX 7900M.

The RX 7800M is armed with 60 RDNA 3 compute units, 96 ROPs, 3,840 stream processors, 48MB of Infinity Cache, and a game clock of 2,145MHz. Bus width was not mentioned, but we suspect it is using a 192-bit interface. Memory bandwidth is rated at up to 432GB/s, memory capacity is 12GB, and GDDR6 ICs operate at up to 18 Gbps. GPU power consumption is rated at up to 180W.
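The suspected 192-bit bus is at least consistent with the quoted bandwidth, since peak GDDR6 bandwidth is simply the bus width in bytes times the per-pin data rate:

```python
def gddr6_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte)
    times the per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# A 192-bit interface at 18 Gbps per pin reproduces AMD's 432 GB/s figure:
bw = gddr6_bandwidth_gbs(192, 18)   # -> 432.0
```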

AMD's new chiplet-style mobile GPU is essentially a stripped-down RX 7800 XT operating at lower clock speeds and power consumption, combined with the lower memory specs of the RX 7700 XT. The GPU's compute unit count also aligns perfectly with that of the new Sony PS5 Pro, meaning the 7800M would most likely have the same compute power as the PS5 Pro in a theoretical scenario where GPU clocks and power consumption were the same.

We previously discovered that the RX 7800M performs very similarly to AMD’s desktop RX 7700 XT in some vendor-provided benchmarks. This is unsurprising, since both GPUs share the same memory configuration and the 7800M’s higher core count offsets its lower power limit and clock speeds. Against Nvidia’s lineup, the RX 7800M is faster than the RTX 4070 laptop GPU but slower than the RTX 4080 mobile counterpart. Performance was also a touch behind Nvidia’s desktop RTX 4070.


Original Submission

posted by janrinok on Friday September 13, @10:22PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

CTIA, the trade organization representing the US wireless industry, said the additional 26 trillion MB used last year represents a 36 percent increase over 2022 and is the largest single-year increase in wireless data ever. It is also enough data for every household in the country to watch the first season of House of the Dragon daily for an entire year.

By 2029, Ericsson predicts that Americans' data usage could increase to more than three times the current rate.

The continued proliferation of 5G networks is helping to drive growth as well. The CTIA said that by the end of 2023, nearly 40 percent of all wireless connections – including smartphones, IoT devices, and wearables – were 5G and that more than 330 million Americans were covered by at least one 5G network. The total number of wireless connections reached 558 million, or more than 1.6 connections for each American.
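The "more than 1.6 connections for each American" figure checks out against the numbers given, using the 330 million covered Americans as a rough population proxy:

```python
connections = 558e6          # total US wireless connections, end of 2023
covered_americans = 330e6    # Americans covered by at least one 5G network,
                             # used here as a rough population proxy

per_person = connections / covered_americans   # about 1.69
```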

The trend is only expected to increase in the coming years as network operators pump even more money into the system. The industry collectively invested $30 billion in 2023 to improve their networks, pushing the total US wireless industry spend to more than $700 billion to date ($190 billion of which has come since 2018). A total of 432,469 cell sites were in operation across the country at the end of 2023, an increase of 24 percent since 2018.

Wireless data is also more affordable now than it has ever been. The cost per MB has dropped 50 percent since 2020 and 97 percent versus a decade ago, down to just $0.002 per MB.


Original Submission

posted by janrinok on Friday September 13, @05:35PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Azure Linux is Microsoft's take on the open source operating system. It is primarily used for internal purposes, but could it become (yet another) distribution option?

Directions on Microsoft analyst Mary Jo Foley suggests the distribution, tuned to be lightweight and secure, has the potential to reach a wider audience.

Because, let's face it, if there's one thing the Linux world needs, it's another distribution for administrators to consider.

Azure Linux was known as CBL-Mariner before it was rebranded, and thank your lucky stars that happened in 2023. These days, it would probably end up being called Copilot for Linux or something similar.

Downloadable from GitHub, Azure Linux can be found running as a container host operating system for the Azure Kubernetes Service (AKS) and supports both x86 and Arm.

The latter point is significant. There is currently no publicly supported version of Windows Server that runs on Arm, despite Microsoft hyping Arm technology via its Copilot+ PCs and datacenter operators increasingly favoring the hardware's power-sipping tendencies. While porting and supporting all of Windows Server's functions to the Linux platform would be a stretch, there is the potential for Microsoft to compete in the Linux enterprise server space.

Foley noted that the world probably doesn't need another Linux distribution. However, the end of support for CentOS has opened up a window of opportunity – even for Microsoft.

"More customer compute in Azure is running Linux on Azure than Windows Server on Azure," according to Foley. Thus, it is hard to think that Microsoft would not like to be part of that besides hosting the workloads.

And then there is Amazon Linux 2, a Linux operating system from Microsoft's arch-cloud rival AWS, which is provided free of additional charge and described as a "security-focused, stable, and high-performance execution environment to develop and run cloud applications." AWS also provides ongoing security and maintenance updates.

If only Microsoft had something similar.

Microsoft's social-media-for-suits platform LinkedIn recently moved from CentOS to Azure Linux. The experience was doubtless a challenge, but, as we noted then: "This can only be good for Azure Linux, and indeed, for Azure in general."

Does the future of Azure Linux lie somewhere other than a relatively obscure way to host containers on AKS? Foley asked Microsoft and was told: "Azure Linux for VM or bare metal use is not available as a commercially supported offering today. Support is limited to AKS as the host OS."

Note the word "today" in that response.

Microsoft is unlikely to make much money directly from Azure Linux going wide. However, it would be a useful driver to the company's Azure cloud platform and soothe concerns over support and maintenance.

However, for many administrators, an attitude of "Anything but Microsoft" persists, certainly since Steve Ballmer's decades-old bonkers "Linux is a cancer" comment. Persuading these same admins that Microsoft can be a trustworthy Linux partner is a challenge that should not be underestimated.


Original Submission

posted by hubie on Friday September 13, @12:47PM   Printer-friendly
from the lawyer-up dept.

https://arstechnica.com/tech-policy/2024/09/elon-musks-x-wins-appeal-to-block-california-content-moderation-law/

Elon Musk's X has won its appeal on free speech grounds to block AB 587, a California law requiring social media companies to submit annual reports publicly explaining their controversial content moderation decisions.

In his opinion, Ninth Circuit Court of Appeals Judge Milan Smith reversed a district court's ruling that he said improperly rejected Musk's First Amendment argument. Smith was seemingly baffled to find that the "district court performed, essentially, no analysis on this question."
[...]
X accused California of trying to spark backlash with a supposed "transparency measure" that forces "companies like X Corp. to engage in speech against their will" by threatening "draconian financial penalties" if companies don't "remove, demonetize, or deprioritize constitutionally protected speech that the state deems undesirable or harmful."

Smith said that the appeals court accounted for these alleged effects in its analysis, but "whether State officials intended these effects plays no role in our analysis of the merits" of X's case.

That's likely because the appeals court agreed that X was likely to prevail in its First Amendment claims, finding that AB 587 compels noncommercial speech that requires strict scrutiny. The law also is not narrowly tailored enough "to serve the State's purported goal of requiring social media companies to be transparent about their policies and practices." As Smith wrote, if the law is just a transparency measure, "the relevant question here is: transparency into what?"
[...]
If AB 587 only required companies to disclose "whether it was moderating certain categories of speech without having to define those categories in a public report," that might work.
[...]
Instead, AB 587's provisions require "every covered social media company to reveal its policy opinion about contentious issues, such as what constitutes hate speech or misinformation and whether to moderate such expression," Smith wrote.

"Even a pure 'transparency' measure, if it compels non-commercial speech, is subject to strict scrutiny," Smith wrote, concluding that X would likely suffer irreparable harm if key parts of the law weren't blocked.
[...]
Smith ordered the case to be remanded to the district court "with instructions to enter a preliminary injunction consistent with the opinion." The district court will also have to determine if unconstitutional parts of the law "are severable from the remainder of AB 587 and, if so, which, if any, of the remaining challenged provisions should also be enjoined."

This is the outcome that the state had asked for if the appeals court sided with X, giving California a fighting chance to preserve some parts of the law. But if the district court decides to strike the entire content moderation report section from the law, AB 587 would be properly gutted—basically only requiring social media companies to post their terms of service on a government website. That's the only part of the law that X did not fight to enjoin on appeal.


Original Submission

posted by hubie on Friday September 13, @08:03AM   Printer-friendly
from the alt.chrome.north.korea dept.

On August 19, 2024, Microsoft identified a North Korean threat actor exploiting a zero-day vulnerability in Chromium, now identified as CVE-2024-7971, to gain remote code execution (RCE). We assess with high confidence that the observed exploitation of CVE-2024-7971 can be attributed to a North Korean threat actor targeting the cryptocurrency sector for financial gain:

Our ongoing analysis and observed infrastructure lead us to attribute this activity with medium confidence to Citrine Sleet. We note that while the FudModule rootkit deployed has also been attributed to Diamond Sleet, another North Korean threat actor, Microsoft previously identified shared infrastructure and tools between Diamond Sleet and Citrine Sleet, and our analysis indicates this might be shared use of the FudModule malware between these threat actors.

CVE-2024-7971 is a type confusion vulnerability in the V8 JavaScript and WebAssembly engine, impacting versions of Chromium prior to 128.0.6613.84. Exploiting the vulnerability could allow threat actors to gain RCE in the sandboxed Chromium renderer process. Google released a fix for the vulnerability on August 21, 2024, and users should ensure they are using the latest version of Chromium.
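"Versions prior to 128.0.6613.84" is a straightforward dotted-version comparison; a sketch of how an administrator might check a build string (the helper name is ours, not a Chromium API):

```python
FIXED = (128, 0, 6613, 84)   # first Chromium release carrying the CVE-2024-7971 fix

def is_vulnerable(version_string: str) -> bool:
    """Compare a dotted Chromium version against the fixed release.
    Python compares tuples element by element, so string splitting plus
    int conversion gives a correct ordering (unlike plain string comparison)."""
    return tuple(int(p) for p in version_string.split(".")) < FIXED

older = is_vulnerable("128.0.6613.83")    # True
patched = is_vulnerable("128.0.6613.84")  # False
```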

Who is Citrine Sleet?

The threat actor that Microsoft tracks as Citrine Sleet is based in North Korea and primarily targets financial institutions, particularly organizations and individuals managing cryptocurrency, for financial gain. As part of its social engineering tactics, Citrine Sleet has conducted extensive reconnaissance of the cryptocurrency industry and individuals associated with it. The threat actor creates fake websites masquerading as legitimate cryptocurrency trading platforms and uses them to distribute fake job applications or lure targets into downloading a weaponized cryptocurrency wallet or trading application based on legitimate applications. Citrine Sleet most commonly infects targets with the unique trojan malware it developed, AppleJeus, which collects information necessary to seize control of the targets' cryptocurrency assets. The FudModule rootkit described in this blog has now been tied to Citrine Sleet as shared tooling with Diamond Sleet.

The article goes on to explain the exploit and FudModule rootkit, and ends with a long list of recommendations.

Originally spotted on Schneier on Security.

Previously: North Korean Hackers Unleashed Chrome 0-Day Exploit on Hundreds of US Targets


Original Submission

posted by hubie on Friday September 13, @03:15AM   Printer-friendly

The SpaceX founder has said humans will be able to go to Mars in just four years:

The 53-year-old businessman made his predictions on a series of social media posts this weekend. He said the next "Earth-Mars transfer window" opens in two years, which is when the first Starships to the "Red Planet" will launch. Musk said the Starships will be uncrewed at first "to test the reliability of landing intact on Mars."

But if everything goes well and the landings are successful, just two years later the first crewed flights to Mars will start departing from our planet. Musk said once the first crewed flights depart, their rate will "grow exponentially", adding that his company has the goal of "building a self-sustaining city in about 20 years."

[...] "Being multiplanetary will vastly increase the probable lifespan of consciousness, as we will no longer have all our eggs, literally and metabolically, on one planet." Many people were excited by Musk's latest claims as one wrote: "This is huge!!" Another added: "What a time to be alive!" One more commented: "The mission to make life multi-planetary really begins."

Founded in 2002, Musk's SpaceX became the first private company to develop a liquid-propellant rocket to reach orbit and the first to send a spacecraft and astronauts to the International Space Station. A year earlier, he had announced the development of Mars Oasis - a project bidding to land a greenhouse and grow plants on Mars.

The stainless-steel Starship is made up of a first-stage booster called Super Heavy and a 165-foot-tall upper-stage spacecraft known as Starship. The spacecraft is designed to be "a fully reusable transportation system designed to carry both crew and cargo to Earth orbit, the Moon, Mars and beyond."


Original Submission

posted by janrinok on Thursday September 12, @10:23PM   Printer-friendly

Leaked Disney+ financials may shed light on recent price hike:

A leak of data from Disney points to the Disney+ streaming service making about $2.4 billion in revenue in its fiscal quarter ending on March 30. Disney doesn't normally share how much revenue its individual streaming services generate, making this figure particularly interesting.

In August, Disney confirmed that it was investigating the leak of "over a terabyte of data from one of the communication systems" it uses. In a report this week, The Wall Street Journal (WSJ) said it looked over files leaked by a hacking group called Nullbulge that include "a range of financial and strategy information," apparent login credentials for parts of Disney's cloud infrastructure, and more. The leak includes over "44 million messages from Disney's Slack workplace communications tool, upward of 18,800 spreadsheets, and at least 13,000 PDFs," WSJ said.

"We decline to comment on unverified information The Wall Street Journal has purportedly obtained as a result of a bad actor's illegal activity," a Disney spokesperson told WSJ.

According to WSJ, financial information came via "documents shared by staffers that detail company operations," adding, "It isn't official data of the sort Disney discloses to Wall Street and might not reflect final financial performance for a given period." That means we should take these figures with a grain of salt.

"Internal spreadsheets suggest that Disney+ generated more than $2.4 billion in revenue in the March quarter," WSJ reported, referencing Disney's fiscal Q2 2024. "It underscores how significant a revenue contributor Hulu is, particularly as Disney seeks to buy out Comcast's stake in that streaming service, and as the two sides spar over its value."

The publication noted that the $2.4 billion figure represents "about 43 percent"—42.5 percent to be more precise—of the direct-to-consumer (DTC) revenue that Disney reported that quarter, which totaled $5,642,000,000 [PDF]. In its Q2 report, Disney put Disney+, Hulu, and Disney+ Hotstar under its DTC umbrella. DTC revenue in Q2 represented a 13 percent increase compared to the same quarter in the prior fiscal year.
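The percentage is easy to verify from the two figures in the article:

```python
disney_plus_revenue = 2.4e9   # leaked Disney+ figure for fiscal Q2 2024
dtc_revenue = 5.642e9         # Disney's reported direct-to-consumer revenue

share = disney_plus_revenue / dtc_revenue * 100   # about 42.5 percent of DTC
```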

Further, subscriber counts for Disney+ and Hulu increased year over year in Q2. The leaks didn't specify how much revenue Disney's streaming businesses made in Q3, but Disney reported that DTC revenue increased to $5.8 billion [PDF].

Right before announcing its Q3 numbers, though, Disney announced price hikes across Disney+, Hulu, and ESPN+ by as much as 25 percent. As we wrote at the time, the price hike seemed like an attempt to push people toward bundle packages offering a combination of Disney+, Hulu, and/or ESPN+ (bundles are supposed to make subscriber churn less likely). Disney CFO Hugh Johnston tried convincing us that Disney's streaming catalog meant that it had "earned" the streaming price hikes.

But the recently leaked numbers shed a little more light on the situation.


Original Submission

posted by janrinok on Thursday September 12, @05:41PM   Printer-friendly

https://blog.cloudflare.com/pingora-saving-compute-1-percent-at-a-time/

Cloudflare's global network handles a lot of HTTP requests – over 60 million per second on average. That in and of itself is not news, but it is the starting point to an adventure that started a few months ago and ends with the announcement of a new open-source Rust crate that we are using to reduce our CPU utilization, enabling our CDN to handle even more of the world's ever-increasing Web traffic.

Motivation

Let's start at the beginning. You may recall a few months ago we released Pingora (the heart of our Rust-based proxy services) as an open-source project on GitHub. I work on the team that maintains the Pingora framework, as well as Cloudflare's production services built upon it. One of those services is responsible for the final step in transmitting users' (non-cached) requests to their true destination. Internally, we call the request's destination server its "origin", so our service has the (unimaginative) name of "pingora-origin".

One of the many responsibilities of pingora-origin is to ensure that when a request leaves our infrastructure, it has been cleaned to remove the internal information we use to route, measure, and optimize traffic for our customers. This has to be done for every request that leaves Cloudflare, and as I mentioned above, it's a lot of requests. At the time of writing, the rate of requests leaving pingora-origin (globally) is 35 million requests per second.
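pingora-origin is written in Rust, but the cleaning step it performs can be sketched in Python as dropping internal headers before a request leaves the network (the header names below are hypothetical, not Cloudflare's actual ones):

```python
INTERNAL_PREFIX = "x-internal-"   # hypothetical; the real names aren't public

def scrub_headers(headers: dict) -> dict:
    """Return a copy of the request headers with internal routing/measurement
    headers removed, so they never reach the origin server."""
    return {k: v for k, v in headers.items()
            if not k.lower().startswith(INTERNAL_PREFIX)}

outbound = scrub_headers({
    "Host": "example.com",
    "User-Agent": "curl/8.0",
    "X-Internal-Route": "colo-42",   # must not leak past Cloudflare's edge
})
```

At 35 million requests per second, even a filter this simple is hot enough that per-request allocations matter, which is the thread the blog post goes on to pull.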


Original Submission

posted by janrinok on Thursday September 12, @12:55PM   Printer-friendly
from the dun-dun-duuun! dept.

https://arstechnica.com/security/2024/09/rogue-whois-server-gives-researcher-superpowers-no-one-should-ever-have/

It's not every day that a security researcher acquires the ability to generate counterfeit HTTPS certificates, track email activity, and execute code of his choice on thousands of servers—all in a single blow that cost only $20 and a few minutes to land. But that's exactly what happened recently to Benjamin Harris.

Harris, the CEO and founder of security firm watchTowr, did all of this by registering the domain dotmobiregistry.net. The domain was once the official home of the authoritative WHOIS server for .mobi
[...]
Harris noticed that the previous dotmobiregistry.net owners had allowed the domain to expire. He then scooped it up and set up his own .mobi WHOIS server there.

To Harris's surprise, his server received queries from slightly more than 76,000 unique IP addresses within a few hours of setting it up. Over five days, it received roughly 2.5 million queries from about 135,000 unique systems. The entities behind the systems querying his deprecated domain included a who's who of Internet heavyweights comprising domain registrars, providers of online security tools, governments from the US and around the world, universities, and certificate authorities, the entities that issue browser-trusted TLS certificates that make HTTPS work.

"watchTowr's research has demonstrated that trust placed in this process by governments and authorities worldwide should be considered misplaced at this stage, in [our] opinion," Harris wrote in a post documenting his research.
[...]
WHOIS has played a key role in Internet governance since its earliest days, back when it was still called the ARPANET. Elizabeth Feinler, an information scientist working for the Augmentation Research Center, became the principal investigator for NIC, short for the Network Information Center project, in 1974. Under Feinler's watch, NIC developed the top-level domain naming system and the official host table and published the ARPANET Directory, which acted as a directory of phone numbers and email addresses of all network users. Eventually, the directory evolved into the WHOIS system, a query-based server that provided a comprehensive list of all Internet host names and the entities that had registered them.

Despite its antiquated look and feel, WHOIS today remains an essential resource with tremendous consequences.
[...]
Harris populated his WHOIS database with junk data that corresponded to all real .mobi addresses. The administrative email addresses and most other contact fields pointed to the watchtowr.com domain. For humor, he also added ASCII art.
[...]
The humor aside, the rogue WHOIS server gave him powers he never should have had. One of the greatest was the ability to dictate the email address certificate authority GlobalSign used to determine if a party applying for a TLS certificate was the rightful owner of the domain name the certificate would apply to. Like the vast majority of its competitors, GlobalSign uses an automated process. An application for example.com, for instance, will prompt the certificate authority to send an email to the administrative email address listed in the authoritative WHOIS for that domain. If the party on the other end clicks a link, the certificate is automatically approved.

When Harris generated a certificate signing request for microsoft.mobi, he promptly received an email from GlobalSign. The email gave him the option of receiving a verification link at whois@watchtowr.com. For ethical reasons, he stopped the experiment at this point.
[...]
"The purchase of a $20 domain that allowed the passive inference of .gov/.mil communications and the subversion of the Certificate Authority verification system should be a clear demonstration that the integrity of the trust and security processes we as Internet users rely on is, and continues to be, extremely fragile," Harris wrote in an online interview. "The systems and security we all take for granted is, in many places, truly held together in ways that would not pass approval in 2024."


Original Submission

posted by hubie on Thursday September 12, @08:14AM   Printer-friendly

Part of a wider global trend that will see V2X technology become standard in most vehicles:

The future connected vehicle does not just use a standard smartphone cellular connection but also takes advantage of dedicated V2X safety communication channels. V2X, which stands for Vehicle-to-Everything, uses either Wi-Fi or cellular-based technology to facilitate communication with other vehicles and traffic infrastructure. If regulation or safety standards mandate this technology, then V2X is set to become the "digital seatbelt" of the future, promising to reduce accidents, improve congestion, and reduce emissions globally by allowing vehicle safety systems to talk to each other and to city traffic infrastructure, even in the pouring rain, dense fog, or busy carparks.

The two most popular V2X technologies, DSRC [Dedicated Short-Range Communications] and C-V2X [Cellular Vehicle-to-Everything], require different hardware. DSRC is based on Wi-Fi protocols, and C-V2X is based on 4G or 5G protocols. Currently, there are approximately 1 million V2X-connected vehicles on the road globally, concentrated mainly in Europe and China. About half the market is using DSRC-based technology and the other half C-V2X, with most of the C-V2X vehicles in China.

IDTechEx is forecasting a significant market shift towards C-V2X technology, with over 90% of the market forecasted to be using 5G-based C-V2X technology by 2034. The biggest contribution to this shift is regulation — the two largest vehicle markets in the world, the US and China, both have governmental organizations actively pushing for C-V2X adoption and have formally abandoned DSRC technology.

[...] If a technology is included in a New Car Assessment Program (NCAP), OEMs aiming to achieve a high safety rating must include it in order to pass certain tests. China has announced V2X inclusion in the CNCAP from 2024 onwards, which is set to result in significant growth for the technology in China. Many manufacturers target a 5-star score in NCAPs, as NCAP scores can significantly impact sales.

[...] One area where V2X could make the largest impact is for autonomous vehicles (AVs). The number and sophistication of sensors in an autonomous vehicle are vast and increase with the level of autonomy. AVs like those in Phoenix or San Francisco currently depend on LiDAR [light detection and ranging], radar, and cameras for the majority of their perception. Each sensor fulfills important functions and ensures robust and safe operation, but these vehicle sensor systems are limited by line-of-sight. Using either DSRC or C-V2X, autonomous vehicles can transmit information at a dedicated frequency (~5.9GHz), with V2X acting as an extra sensor that works in all weather conditions and can go through walls and obstacles, effectively solving the line-of-sight problem. The main feasible method for achieving this is to use V2X to broadcast the location-related information of each car. A connected vehicle receiving the information can calculate the possibility of collision with the other vehicle using onboard compute. If the risk is high, the driver (or passenger of an autonomous vehicle) will be immediately warned, and the system will adjust accordingly to avoid a collision safely and effectively.
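The collision calculation described above is, in its simplest form, a closest-point-of-approach (CPA) problem: given each vehicle's broadcast position and velocity, predict the minimum separation on current courses. A 2-D constant-velocity sketch (the 5 m threshold is an arbitrary illustrative value, not from any V2X standard):

```python
import math

def collision_risk(p1, v1, p2, v2, threshold_m=5.0):
    """Given two vehicles' broadcast positions (m) and velocities (m/s),
    find the closest point of approach and flag a risk if the predicted
    miss distance falls below the threshold. 2-D, constant velocity."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    speed2 = vx * vx + vy * vy
    if speed2 == 0.0:
        t_cpa = 0.0            # identical velocities: separation is constant
    else:
        # Time minimizing |r + v*t|, clamped so we never look into the past.
        t_cpa = max(0.0, -(rx * vx + ry * vy) / speed2)
    miss = math.hypot(rx + vx * t_cpa, ry + vy * t_cpa)
    return miss < threshold_m, t_cpa, miss

# Two cars approaching the same intersection at right angles, 15 m/s each:
risky, t, d = collision_risk((0, 0), (15, 0), (100, -100), (0, 15))
print(risky, round(t, 2), round(d, 2))  # True 6.67 0.0
```

Real deployments are more involved (map matching, sensor fusion, message authentication), but this is the core geometry the broadcast location data enables.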


Original Submission

posted by hubie on Thursday September 12, @03:28AM   Printer-friendly
from the Advertising-Ruins-Everything dept.

https://therecord.media/ford-patent-application-in-vehicle-listening-advertising

Ford Motor Company is seeking a patent for technology that would allow it to tailor in-car advertising by listening to conversations among vehicle occupants, as well as by analyzing a car's historical location and other data, according to a patent application published late last month.

"In one example, the controller may monitor user dialogue to detect when individuals are in a conversation," the patent application says. "The conversations can be parsed for keywords or phrases that may indicate where the occupants are traveling to."

The tech — labeled as "in-vehicle advertisement presentation" — will determine where a car is located, how fast it is traveling, what type of road it is driving on and whether it is in traffic. It also will predict routes, speeds and destinations to customize ads to drivers, the application said.

The system could pull data from "audio signals within the vehicle and/or historical user data, selecting a number of the advertisements to present to the user during the trip," the patent application said.
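The keyword parsing the application describes can be sketched trivially, which is part of why it is unsettling. Everything below is invented for illustration; the patent does not disclose Ford's actual keyword tables or matching logic.

```python
# Toy sketch of conversation parsing for destination hints.
# The phrase-to-destination table is entirely hypothetical.
DESTINATION_KEYWORDS = {
    "grocery": "supermarket",
    "groceries": "supermarket",
    "coffee": "coffee shop",
    "gas": "gas station",
    "airport": "airport",
}

def infer_destinations(transcript: str) -> set:
    """Scan a cabin-conversation transcript for destination keywords."""
    words = transcript.lower().replace(",", " ").replace(".", " ").split()
    return {DESTINATION_KEYWORDS[w] for w in words if w in DESTINATION_KEYWORDS}

print(sorted(infer_destinations("Let's grab coffee before we hit the airport.")))
# ['airport', 'coffee shop']
```

A production system would presumably feed a speech-to-text transcript into something far more sophisticated, but the privacy concern is the same at any level of sophistication: the input is in-cabin conversation.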


Original Submission