The Hunt is On: ‘The Witcher 3: Wild Hunt’ Next-Gen Update Coming to GeForce NOW

Image credit: NVIDIA

The Witcher 3: Wild Hunt next-gen update will be available on GeForce NOW starting next week. 

In a statement, NVIDIA said members can now play new seasons of Fortnite and Genshin Impact, as well as eight new games added to the library.

Furthermore, the latest GeForce NOW app will be available this week with support for syncing members’ Ubisoft Connect library of games, allowing them to get into their favourite Ubisoft games even faster.

Additionally, gamers in the UK, Netherlands, and Poland have the first opportunity to purchase the 13.3-inch HP Chromebook x360, designed for extreme multitasking and featuring an adaptive 360-degree design. A free one-month GeForce NOW Priority membership is included with every Chromebook purchase.

Triss the Season

On Wednesday, 14 December, CD PROJEKT RED will release the next-gen update for The Witcher 3: Wild Hunt — Complete Edition. Anyone who purchased the game from Steam, Epic Games, or GOG.com is eligible for the update for free, and GeForce NOW subscribers can benefit from improved visuals on almost all of their devices.

A new photo mode, significantly improved visuals, and content drawn from Netflix's The Witcher series are all included in the next-generation update. It also includes RTX Global Illumination, ray-traced ambient occlusion, shadows, and reflections, which give the game more cinematic detail.

Play as Geralt of Rivia on a mission to find his adopted daughter Ciri, the Child of Prophecy and bearer of the potent Elder Blood, across all your devices, without having to wait for the update to download and install. GeForce NOW RTX 3080 and Priority members can play with RTX ON and NVIDIA DLSS on almost any device, including Macs and mobile devices, to explore The Witcher's stunning open world at high frame rates.

Get in Sync

The GeForce NOW 2.0.47 app update will be available this week, and it will include support for syncing Ubisoft Connect accounts with your GeForce NOW library.

Members will be able to access their Ubisoft games more quickly and easily with this new game-library sync for Ubisoft Connect. When streaming supported GeForce NOW games purchased directly from Ubisoft or Epic Games Store, members will be automatically logged into their Ubisoft account across all devices once synced. These include games like Rainbow Six Siege and Far Cry 6.

The update also includes bug fixes and enhancements to voice chat with Chromebook built-in microphones. The update will be available for PC, Mac, and browser clients in the coming days.

‘Tis the Season

This week’s updates to some of the hottest titles streaming from the cloud and eight new games to play ensure that the action never stops on GeForce NOW.

Members can now play Fortnite Chapter 4, which includes a new island, newly forged weapons, a new realm, and new ways to get around, such as riding a dirt bike or rolling around in a snowball. A new cast of combatants, including Geralt of Rivia, is also available.

Genshin Impact’s Version 3.3 “All Senses Clear, All Existence Void” is available to stream on GeForce NOW as well, introducing a new season of events, a new card game called the Genius Invokation TCG, and two powerful allies — the Wanderer and Faruzan — for more stories, fun and challenges.


Supply chain security continues to ripple through chip sector

The global tech sector has come under considerable pressure this year from rising interest rates, elevated inflation, and a slowing global economy. (ddp) Image credit: UBS
Media Release by UBS

US President Joe Biden heralded TSMC's expanded investment in Arizona as a big win after supply chain issues disrupted the US economy during the COVID-19 pandemic. The investment is the latest in a string of major commitments by chipmakers since the CHIPS and Science Act passed this summer, including from IBM and Micron.


Separately, Dutch officials are reportedly planning to enforce new controls on exports of chipmaking equipment to China, according to Bloomberg News on Wednesday. Since 2018, the Netherlands has prohibited its largest company, semiconductor-equipment maker ASML Holding, from exporting its most advanced machines to China due to their potential military application, Bloomberg reported.

The global tech sector has come under considerable pressure this year from rising interest rates, elevated inflation, and a slowing global economy. We expect global IT to remain under pressure in the near term, and we maintain a least preferred stance on the sector.

But we continue to believe the structural theme surrounding the era of security, including securing supply chains, remains compelling. We believe that energy, food, and technological security will be increasingly prioritized by governments and businesses, even at the cost of efficiency. This will likely create winners and losers across the investment landscape in the decade ahead.

Cybersecurity. We believe that the rise in remote working, alongside perceived threats from both state and non-state actors, will drive an increased focus on cybersecurity solutions. Although we are cautious on the global IT sector in the near term, we think that cybersecurity companies are relatively defensive within the technology sector because security is seen as essential, and companies and governments tend to maintain spending even in the face of economic downturns.

Indeed, underlying spending on cybersecurity remained resilient in recent results, and cybersecurity stocks held up better than other tech peers during the rout this year.

Energy security. Plans to transition energy production away from fossil fuels have been underway for years. But Russia’s invasion of Ukraine and subsequent disruptions to European energy supplies are likely to accelerate those plans—as much on security grounds as on environmental ones. We expect the era of security to drive energy-related commodity prices higher over the long term.

A focus on sourcing supplies from allied nations, structural underinvestment, efforts to achieve net-zero emissions, and a need to meet growing emerging market demand should all help support prices. We also continue to expect companies linked to renewable energy solutions to see increased demand in the years ahead.

Food security. More broadly, the security and safety theme also includes threats to the global food chain and to air, water, and soil quality. The food price shocks stemming from supply disruptions related to COVID-19, climate change, and the Russia-Ukraine war are driving governments and international organizations to rethink all aspects of food security, in our view. The need for food security will favor stocks linked to improving agricultural yields, reducing environmental damage, saving water, and adapting to climate change. Efforts to improve efficiencies across the supply chain will drive opportunities in areas including smart agriculture, alternative protein, and logistics.

So, we believe that energy, food, and technological security will be increasingly prioritized by governments and businesses, creating investment opportunities in the decade ahead.


‘Collaborative robots’ working with employees

Image credit: Queensland University of Technology
Media Release by Queensland University of Technology

Australian Research Council (ARC) Deputy Chief Executive Officer Dr Richard Johnson officially launched the Australian Cobotics Centre in Brisbane yesterday.

Led by QUT, the Australian Cobotics Centre (ARC Training Centre for Collaborative Robotics in Advanced Manufacturing) is working closely with Australian businesses to shape the innovative use of collaborative robots, or 'cobots', to combine the strengths of humans and robots in shared work environments.

The ARC awarded QUT about $5 million over 5 years for the Australian Cobotics Centre in the 2020 round of the ARC Industrial Transformation Research Program. The Centre is further supported by $2.73 million in funding from participating universities and partner organisations and $6.8 million in in-kind support.

In officially launching the initiative, Dr Johnson highlighted the capacity of well-designed initiatives that bring university researchers and manufacturing businesses together to boost Australia’s competitiveness in advanced manufacturing.

ARC Industrial Transformation Training Centres support collaboration between Australia's most innovative researchers and industries, creating a workforce of the future that can transition seamlessly between industry and research institutions.

“This training centre, which is focused on the practical application of robotics within manufacturing, has created a structured, intergenerational research and translation environment, where new and emerging researchers work with highly experienced researchers, and they do so side-by-side with Australian manufacturers who bring practical business needs and opportunities,” Dr Johnson said.

In this collaboration, QUT has partnered with Swinburne University of Technology, UTS, the Technical University of Dortmund, and industry partners including ARM Hub, B&R Enclosures, Cook Medical, InfraBuild, IR4 and Weld Australia.


Intel, Habana Labs and Hugging Face advance deep learning software

Image credit: Intel

Intel, Habana Labs, and Hugging Face have continued to enhance efficiencies and lower obstacles to AI adoption over the last year through open-source initiatives, integrated developer experiences, and scientific research.

In a statement, Intel said the work has resulted in significant breakthroughs and efficiency in establishing and training high-quality transformer models.

Transformer models deliver state-of-the-art results across a variety of machine learning and deep learning tasks, including natural language processing (NLP), computer vision (CV), speech, and others. Training these deep learning models at scale requires a significant amount of computational power, making the process time-consuming, difficult, and expensive.

The goal of Intel's continuous collaboration with Hugging Face, as part of the Intel Disruptor Program, is to increase adoption of training and inference solutions optimised for the latest Intel® Xeon® Scalable processors and Habana Gaudi® and Gaudi®2 accelerators. The collaboration brings the most advanced deep learning innovation from the Intel AI Toolkit to the Hugging Face open-source community and informs future Intel® architectural innovation. This work has resulted in advances in distributed fine-tuning on Intel Xeon processors, built-in optimisations, rapid training with Habana Gaudi, and few-shot learning.

Distributed Fine-Tuning on the Intel Xeon Platform

When training on a single CPU node is too slow, data scientists use distributed training: clustered servers each maintain a copy of the model, train it on a subset of the training dataset, and exchange results across nodes via the Intel® oneAPI Collective Communications Library (oneCCL) to converge to a final model more quickly. Transformers now natively supports this capability, which simplifies distributed fine-tuning for data scientists.
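The data-parallel pattern described above can be sketched in plain Python. This is a conceptual illustration only, not the Transformers API: real training would use PyTorch with the oneCCL communication backend, and `loss_grad` and `train` here are hypothetical stand-ins for a model's backward pass and training loop:

```python
# Conceptual sketch of data-parallel training: each worker computes a
# gradient on its own data shard, then the workers average the gradients
# (the all-reduce step oneCCL performs in the real setup) before every
# parameter update, so all copies of the model stay in sync.

def loss_grad(w, shard):
    # Hypothetical one-parameter model: minimise sum((w*x - y)^2) over the shard.
    return sum(2 * x * (w * x - y) for x, y in shard) / len(shard)

def train(shards, w=0.0, lr=0.01, steps=100):
    for _ in range(steps):
        # Each worker computes a local gradient on its shard ...
        grads = [loss_grad(w, shard) for shard in shards]
        # ... then the averaged gradient drives a single shared update.
        w -= lr * sum(grads) / len(grads)
    return w

# Two workers, each holding half of the data for the line y = 3x.
data = [(x, 3.0 * x) for x in range(1, 9)]
w = train([data[:4], data[4:]])
print(round(w, 3))  # converges towards 3.0
```

Because every worker applies the same averaged gradient, the result matches single-node training on the full dataset, only faster in wall-clock terms.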

One example is to use a distributed cluster of computers using Intel Xeon Scalable processors to speed up PyTorch training for transformer models. Intel created the Intel extension for PyTorch to take advantage of the hardware features offered by the most recent Intel Xeon Scalable processors, including Intel® Advanced Matrix Extensions (Intel® AMX), AVX-512, and Intel Vector Neural Network Instructions (VNNI). This software library offers automatic speedup for inference and training.

Hugging Face Transformers also includes a Trainer API, making it easy to begin training without writing a training loop from scratch. The Trainer provides an API for hyperparameter search and supports several search backends, including SigOpt, Intel's hosted hyperparameter optimisation service. This allows data scientists to find the optimal model more efficiently.
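The idea behind hyperparameter search can be sketched with a stdlib random search. This is not the Trainer API (real usage calls `trainer.hyperparameter_search()` with a configured backend); `evaluate`, `random_search`, and the parameter ranges are hypothetical stand-ins for a full train-and-validate run:

```python
import math
import random

def evaluate(lr, batch_size):
    # Hypothetical objective standing in for "fine-tune, then measure
    # validation loss"; by construction the best settings are a learning
    # rate of 1e-4 and a batch size of 32.
    return (math.log10(lr) + 4) ** 2 + (batch_size / 32 - 1) ** 2

def random_search(trials=50, seed=0):
    rng = random.Random(seed)
    best_score, best_params = float("inf"), None
    for _ in range(trials):
        params = {
            "lr": 10 ** rng.uniform(-6, -2),       # log-uniform learning rate
            "batch_size": rng.choice([8, 16, 32, 64]),
        }
        score = evaluate(**params)                 # lower is better
        if score < best_score:
            best_score, best_params = score, params
    return best_score, best_params

score, params = random_search()
print(score, params)
```

A hosted service such as SigOpt replaces the blind random sampling with a model of the objective, so fewer expensive training runs are needed to reach a good configuration.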

Optimum Developer Experience

Hugging Face designed Optimum, an open-source library, to ease transformer acceleration across an increasing spectrum of training and inference devices. Beginners may use Optimum immediately, thanks to built-in optimisation algorithms and ready-made scripts, while professionals can keep modifying for maximum efficiency.

The Optimum Intel interface connects the transformers library to the various tools and libraries supplied by Intel to expedite end-to-end pipelines on Intel platforms. Built on top of the Intel® Neural Compressor, it provides a unified experience for popular network compression algorithms such as quantisation, pruning, and knowledge distillation. Furthermore, utilising the Optimum Intel library, developers may more easily execute post-training quantisation on a transformer model to compare model metrics on evaluation datasets.
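A minimal sketch of the core arithmetic behind post-training quantisation may help. The `quantize`/`dequantize` helpers below are illustrative only, not the Neural Compressor or Optimum Intel API, which additionally handle calibration, per-channel scales, and operator fusion:

```python
# Minimal sketch of affine (scale/zero-point) uint8 quantisation, the basic
# operation behind post-training quantisation: map a float range onto the
# 0..255 integer range, then recover approximate floats when needed.

def quantize(values, num_bits=8):
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return [(qi - zero_point) * scale for qi in q]

weights = [-0.8, -0.1, 0.0, 0.35, 0.9]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

Quantised weights occupy a quarter of fp32 storage, at the cost of a rounding error bounded by roughly half the scale per value, which is why tools compare model metrics on an evaluation dataset before and after quantisation.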

Optimum Intel also provides a straightforward interface for optimising transformer models, converting them to OpenVINO intermediate representation format, and running OpenVINO inference.

Accelerated Training with Habana Gaudi

Hugging Face and Habana Labs are cooperating to make training large-scale, high-quality transformer models easier and faster. With a few lines of code, data scientists and machine learning engineers can accelerate transformer training with Habana processors – Gaudi and Gaudi2 – using Habana's SynapseAI® software suite and the Hugging Face Optimum-Habana open-source library.

The Optimum-Habana library supports a range of computer vision, natural language, and multimodal models. BERT, ALBERT, DistilBERT, RoBERTa, Vision Transformer, Swin, T5, GPT2, Wav2Vec2, and Stable Diffusion are among the supported and tested model architectures. There are presently over 40,000 models based on these architectures available on the Hugging Face Hub, which developers can easily enable on Gaudi and Gaudi2 using Optimum-Habana.

The cost-to-performance ratio of the Habana Gaudi solution, which powers Amazon's EC2 DL1 instances, is up to 40 per cent better than comparable training solutions, allowing clients to train more while spending less. Gaudi2, which is based on the same high-efficiency architecture as the first-generation Gaudi, promises to achieve excellent price performance.

The Optimum-Habana package now includes Habana DeepSpeed, which makes it simple to configure and train big language models at scale on Gaudi devices using DeepSpeed optimisations. See the Optimum-Habana DeepSpeed usage guide to learn more.

The current version of Optimum-Habana incorporates support for the Hugging Face diffusers library's Stable Diffusion pipeline, providing the Hugging Face developer community with cost-effective text-to-image generation on Habana Gaudi.

Few-shot Learning in Production

SetFit, a framework for few-shot fine-tuning of Sentence Transformers, was recently introduced by Intel Labs, Hugging Face, and UKP Lab. Few-shot learning using pre-trained language models has emerged as a promising solution to a real challenge for data scientists: dealing with unlabelled data.

Current solutions for few-shot fine-tuning require handmade prompts or verbalisers to translate examples into a format suitable for the underlying language model. SetFit eliminates prompts altogether by generating rich embeddings directly from a small number of labelled text examples.

Researchers created SetFit to work with any Sentence Transformer on the Hugging Face Hub, allowing text to be classified in many languages by fine-tuning a multilingual checkpoint.
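The embedding-centroid idea behind SetFit can be illustrated with a toy stdlib sketch: embed a handful of labelled examples, average the embeddings into per-class centroids, and classify new text by its most similar centroid. The bag-of-words `embed` function is a crude hypothetical stand-in for a real Sentence Transformer, and `fit`/`predict` are not the SetFit API:

```python
# Toy illustration of embedding-based few-shot classification: a few
# labelled examples per class are enough because classification happens
# in embedding space rather than via prompts or verbalisers.

from collections import defaultdict

def embed(text):
    # Crude stand-in for a sentence embedding: normalised word counts.
    counts = defaultdict(float)
    for token in text.lower().split():
        counts[token] += 1.0
    norm = sum(c * c for c in counts.values()) ** 0.5
    return {t: c / norm for t, c in counts.items()}

def fit(examples):
    # examples: list of (text, label) pairs; returns one centroid per class.
    groups = defaultdict(list)
    for text, label in examples:
        groups[label].append(embed(text))
    centroids = {}
    for label, vecs in groups.items():
        centroid = defaultdict(float)
        for vec in vecs:
            for token, x in vec.items():
                centroid[token] += x / len(vecs)
        centroids[label] = centroid
    return centroids

def predict(centroids, text):
    vec = embed(text)
    return max(centroids, key=lambda label: sum(
        centroids[label][t] * x for t, x in vec.items()))

model = fit([
    ("great film loved it", "pos"), ("wonderful acting great story", "pos"),
    ("terrible film hated it", "neg"), ("awful boring story", "neg"),
])
print(predict(model, "loved the wonderful story"))
```

SetFit's real gain comes from first fine-tuning the Sentence Transformer contrastively on the few labelled pairs, so the embedding space itself adapts to the task before a classification head is fitted.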


MAPay to Create First 100 Million NFTs for Digital Health Records on the Algorand Blockchain

Image credit: MAPaycorp, Twitter
Media Release by MAPay

MAPay, a global healthcare technology firm with a focus on decentralized payment networks, unveiled its partnership with the Ministry of Public Health and Family Welfare in the Government of Maharashtra, India, to provide NFT technology that will store personal health data on the blockchain for the first time. Built on Algorand, the first deployment will introduce upwards of 100 million NFTs for this purpose.

MAPay will use its proprietary NFT technology to enable secure, decentralized storage. This application for NFTs will help eliminate intermediaries in the healthcare system that routinely cause bottlenecks, introduce risk, and drive up costs for all parties – including patients; public, private, and government health providers; insurance companies; and banks.

With key healthcare partnerships in the U.S. and beyond, MAPay will use the Algorand blockchain as the backbone for future advancements in the sector. In collaboration with leading organizations focused on advancing interoperability and payments in healthcare, MAPay seeks to implement a patient-driven data exchange that aids in population health management and better overall outcomes. The firm is working alongside large pharma, insurers, health systems, banks and governments.

“Our vision and passion is aligned with our global partners: to democratize healthcare. We want to truly transform Healthcare to Humancare,” said MAPay CEO Michael Dershem. “This use case is a perfect real-world application of blockchain technology. The impact on individuals and society as a whole is what we wake up to accomplish every day.”

The Algorand blockchain has experienced zero downtime since launch. Its sustainable, carbon-negative technology offers immediate transaction finality, consensus and network-level security, and advanced smart contracts built for scalability. The open, public infrastructure powers participation, transparency, and efficiency with real-world applications spanning many industries, including healthcare, finance, humanitarian aid, government, sports, entertainment, and more.

“As a clinician and a participant in shaping public health policy, I recognize the need for reliable integration and storage of clinical health records. This digital infrastructure is critical in building autonomous AI solutions, especially where accurate, accessible healthcare is not easily available,” said Dr. Sabine Kapasi, UN Health Policy and Governance Strategy lead. “This technology is transformational. Its transference and acceptance globally once seemed unimaginable; but now it is within reach.”


Accenture and Planet to Collaborate on AI-Powered Geospatial Intelligence Tools for Sustainability, Traceable Supply Chain and Climate Risk Solutions

Image credit: Accenture
Media Release by Accenture

Accenture, through its Accenture Ventures Project Spotlight initiative, has entered into a collaboration agreement with Planet Labs PBC, a leading provider of daily data and insights about Earth, to help power decision-making at organizations across myriad industries including agriculture, consumer packaged goods, energy, forestry and government.
 
By combining Planet’s high frequency satellite imagery data with Accenture’s broad array of sustainability services and deep industry and technology expertise, the two companies will collaborate on an array of sustainability and impact initiatives, including measurement, traceable supply chain strategy and data-based climate risk assessments to mitigate disruption across global value chains.
 
The collaboration with Planet is aligned with Accenture Ventures’ Project Spotlight, an engagement and early investment program that connects emerging technology startups with the Global 2000 to fill strategic innovation gaps.
 
“With its vast network of satellites, Planet offers one of the richest sources of daily, timely data about what is happening to our Earth — rich data that can yield incredibly valuable insights to help drive sustainability advancements and benefit industry and society as a whole,” said Tom Lounibos, managing director, Accenture Ventures. “With our organizations’ shared visions and values around sustainability as a key underpinning, we believe our collaboration with Planet will yield countless opportunities to combine our strengths and innovate on behalf of new and existing shared clients.”

Planet operates the world’s largest commercial fleet of approximately 200 earth imaging satellites that capture the majority of Earth’s landmass every day. Planet’s data is transforming the way companies and governments use satellite imagery data, delivering insights at the daily pace of change on earth.
 
Timely data derived from high-frequency satellite imagery can play a powerful role in driving numerous types of sustainability efforts ranging from energy transformation and sustainable agriculture to sustainable commodity sourcing and end-to-end supply chain tracking. The imagery can also yield valuable data to help financial services organizations identify market trends and risks tied to climate change, energy resources and commodity availability, while also helping public sector defense and intelligence organizations make more informed decisions pertaining to energy security, food security and other mission-critical or tactical decision making.
 
For example, Accenture's Global H&PS AI/ESG teams and Accenture Federal Services' Applied Intelligence Discovery Lab partnered with Planet using AIP+ (a managed-service, multi-cloud environment for advanced AI and machine learning tools and applications, as well as data exchange for sustainability) to create a proof-of-concept for the Inter-American Development Bank. Using AI/machine learning models and Planet data, Accenture was able to create a geospatial solution that models the impact of climate change on energy capacity in Central America and the Caribbean. This was presented at the IDB Invest Sustainability Week conference in Miami in June 2022.
 
“Broadening the adoption of practical AI applications as an enabler of ESG risk and impact management is critical for IDB Invest,” said Leonardo Mazzei, Sustainability Sector lead at IDB Invest. “The proof-of-concept developed in partnership with Accenture is an example of how to leverage novel AI capabilities to accelerate sustainability-linked solutions, putting data-driven decision-making at the core of how businesses operate.”
 
“Accenture has demonstrated a deep commitment to sustainability and to helping its clients around the world utilize innovative technologies and new sources of data to help drive their sustainability efforts forward, so we view this as a very strategic collaboration,” said Kevin Weil, president, Product and Business at Planet. “In addition to opportunities to collaborate with many shared clients, we believe our participation in Accenture Ventures’ Project Spotlight program will help illustrate the benefits our data can deliver to organizations across different industry sectors.”
 
The Accenture and Planet collaboration builds on Accenture’s broader sustainability initiatives with a variety of clients, strategic alliance partners, and other international organizations, which includes very early-stage investments in key technologies and services that create value not just for clients but also its broader ecosystem partners, as sustainability continues to solidify as a top issue for all stakeholders.
 
“Working with Planet underscores Accenture’s commitment to space innovation and exploration as a sizable business opportunity for our clients,” added Paul Thomas, space innovation lead, Technology Innovation at Accenture. “From accelerating orbital science to facilitating secure commerce systems in space, we see tremendous benefits in space innovation for business, society, and the planet. In an uncertain world, technology advancement is the one certainty you can count on – today and in the future.”


AI-powered gardening app GardenMate wins 5th annual Call for Code

Image credit: IBM

Call for Code founding partner IBM, creator David Clark Cause, United Nations Human Rights, and program affiliate the Linux Foundation announced GardenMate as the winner of the 5th annual Call for Code Global Challenge.

The annual event invited innovators worldwide to help accelerate sustainability efforts and combat climate change with open-source-powered technology.

GardenMate won the top prize this year for developing an app that uses IBM Watson® to link gardeners with excess produce to people in need.

IBM's participation in Call for Code extends the company's ambition to equip businesses with the skills needed to develop meaningful sustainability solutions. Participants gain access to premier hybrid cloud and AI technology, including open-source-powered software such as Red Hat OpenShift, IBM Cloud, and IBM's AI portfolio, including IBM Watson Assistant. Since its inception, the challenge has attracted 500,000 developers and problem solvers from 180 countries.

IBM Ecosystem general manager Kate Woolley said since the inaugural competition in 2018, IBM and Call for Code have empowered developers to tackle the world’s most pressing issues, such as combatting pollution and food inequality.

“This year’s Call for Code demonstrates the impact developers can have working with our Ecosystem partners to help affect change and create a better future through the use of technologies such as Hybrid Cloud and AI,” Woolley stated.

Winning 2022 Call for Code Global Challenge Solutions

GardenMate will get $200,000 USD in addition to assistance from the Linux Foundation in open-sourcing their application and deployment support from IBM Ecosystem partners. Four other sustainability solutions were also recognised:

  • The second prize and $25,000 were awarded to pπ, an AI-powered camera that monitors drainage and sewage canals.
  • Nearbuy, a shopping assistant that helps users find pre-loved things in their area, was awarded third place and $25,000.
  • ESSPERA, a machine learning-powered tool that can assist farmers in selecting the best seeds for the forthcoming growing season, took fourth place and $10,000.
  • SwachBIN, a machine learning algorithm that assists waste bins in classifying materials as trash or recycling, took fifth place and $10,000.

TransXEnergy was selected as the winner of the Call for Code University Edition, a cooperation between IBM and the Clinton Global Initiative University, by Chelsea Clinton, vice chair of The Clinton Foundation. TransXEnergy, an auction and blockchain-based peer-to-peer energy trading network developed by a team of Monash University Malaysia student developers, collects accessible energy data from sources such as electric vehicles and smart houses and connects buyers and sellers via a mobile application. TransXEnergy will receive a $15,000 award as the Call for Code University winner.

“I want to congratulate GardenMate for their remarkable innovation, and all the other problem solvers around the world who contributed their time and talent to help make Call for Code such a remarkable success,” said David Clark, Founder and CEO of Call for Code. “I also want to thank our Founding Partner IBM for their longstanding passion and commitment, along with our global partner United Nations Human Rights, The Linux Foundation, Clinton Global Initiative University, and all of our ecosystem partners who came together to empower developers to create sustainable solutions to help the most vulnerable among us, by creating software that helps to mitigate and adapt to the escalating effects of climate change.”

The winning solutions demonstrate IBM’s Environmental, Social, and Governance (ESG) strategy, which combines client, partner, and government collaboration with the use of technology such as Hybrid Cloud and AI to help build a more secure and equitable future.

IBM Ecosystem Partners and Open-Source Community Advance Call for Code Solutions

The IBM Ecosystem was instrumental in this year’s competition, providing subject matter experts to assist teams in advancing their solutions and encouraging employee participation. Partners such as Arrow Electronics, EY, Intuit, Persistent Systems, Ingram Micro, and New Relic also provided developer resources and technical experience to new initiatives.

“The annual Call for Code event is close to my heart. The event is a great way to empower and inspire our developers to create solutions that can make a difference to the world,” said Dr. Anand Deshpande, founder, chairman, and managing director, Persistent. “Call for Code is a part of our ESG strategy, and we encourage our employees to participate in thinking out-of-the-box and sharpening their problem-solving skills. By partnering with IBM, we work on solving the world’s most pressing business and societal issues for clients.”

The 2020 Call for Code Global Challenge winner Agrolly’s OpenTempus, which provides a yearly forecast of temperature and precipitation, and the 2021 Call for Code IBM Challenge winner OpenHarvest, which provides a machine learning-powered recommendation engine to help farmers manage their crops, are now two additional open source projects that the Linux Foundation oversees. Developers can now contribute to both of these projects.


How NVIDIA helped 3D researchers bring naval history to life

HMAS Sydney (II) in 1940. (Photo: Allan C. Green from the State Library of Victoria). Image credit: Medianet
Media Release by PR Deadlines

Museum goers will be able to explore two sunken WWII ships as if they were scuba divers on the ocean floor, thanks to work at Curtin University in Perth, Australia.

Exhibits in development, for display in Australia and potentially further afield, will use exquisitely detailed 3D models the researchers are creating to tell the story of one of the nation’s greatest naval battles. NVIDIA technology has a key role.

On November 19, 1941, Australia’s HMAS Sydney (II) and Germany’s HSK Kormoran lobbed hundreds of shells in a duel that lasted less than an hour. More than 700 died, including every sailor on the Sydney. Both ships sank to 8,000 feet, 130 miles off the coast of Western Australia, not to be discovered for decades.

Andrew Woods, an expert in stereoscopic 3D visualisation and associate professor at Curtin, built an underwater rig with more than a dozen video and still cameras to capture details of the wrecks in 2015.

Ash Doshi, a computer vision specialist and senior research officer at Curtin, is developing and running software on NVIDIA GPUs that stitches the half-million pictures and 300 hours of video they took into virtual and printed 3D models.

3D at battleship scale

It’s difficult pioneering work in a process called photogrammetry. Commercially available software maxes out at around 10,000 images.

“It’s highly computationally intensive — when you double the number of images, you quadruple the compute requirements,” said Woods, who manages the Curtin HIVE, a lab with four advanced visualisation systems.

“It would’ve taken a thousand years to process with our existing systems, even though they are fairly fast,” he said.

When completed next year, the work will have taken less than three years, thanks to systems at the nearby Pawsey Supercomputing Centre using NVIDIA V100 and prior-generation GPUs.

Speed enables iteration

Accelerated computing is critical because the work is iterative. Images must be processed, manipulated and then reprocessed.

For example, Woods said a first pass on a batch of 400 images would take 10 hours on his laptop. By contrast, he could run a first pass in 10 minutes on his system with two NVIDIA RTX A6000 GPUs awarded through NVIDIA’s Applied Research Accelerator Program.

It would take a month to process 8,000 images on the lab’s fast PCs, work the supercomputer could handle in a day. “Rarely would anyone in industry wait a month to process a dataset,” said Woods.
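That quadratic growth follows from naive pairwise image matching in photogrammetry: every image is compared against every other, so cost tracks the number of pairs, n(n-1)/2. Below is a back-of-envelope extrapolation from the article's 10-hour, 400-image laptop figure; the function names and the assumption of purely pairwise cost are ours, not the team's actual pipeline:

```python
# Back-of-envelope for the quadratic scaling quoted above: naive
# photogrammetry matches every image against every other, so cost
# grows with the number of pairs, n * (n - 1) / 2.

def pairs(n):
    return n * (n - 1) // 2

def scaled_hours(n_images, base_n=400, base_hours=10.0):
    # Extrapolate from the article's figure of roughly 10 laptop-hours
    # for a 400-image batch.
    return base_hours * pairs(n_images) / pairs(base_n)

print(pairs(800) / pairs(400))       # doubling images roughly quadruples pairs
print(round(scaled_hours(8000), 1))  # laptop-hours for an 8,000-image batch
```

The same extrapolation puts an 8,000-image batch at roughly 4,000 laptop-hours, which is why clusters of GPUs at Pawsey turn a month of PC time into a day.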

From films to VR

Local curators can’t wait to get the Sydney and Kormoran models on display. Half the comments on their Tripadvisor page already celebrate 3D films the team took of the wrecks.

The digital models will more deeply engage museumgoers with interactive virtual and augmented reality exhibits and large-scale 3D prints.

“These 3D models really help us unravel the story, so people can appreciate the history,” Woods said.

In a video call, Woods and Doshi showed how the forces of the sinking embedded an anchor in the Kormoran’s hull.

The exhibits are expected to tour museums in Perth and Sydney, and potentially cities in Germany and the U.K., where the ships were built.

When the project is complete, the researchers aim to make their code available so others can turn historic artifacts on the seabed into rare museum pieces. Woods expects the software could also find commercial uses monitoring undersea pipelines, oil and gas rigs and more.

A real-time tool

On the horizon, the researchers want to try Instant NeRF, an inverse rendering tool NVIDIA researchers developed to turn 2D images into 3D models in real time.

Woods imagines using it on future shipwreck surveys, possibly running on an NVIDIA DGX System on the survey vessel. It could provide previews in near real time based on images gathered by remotely operated underwater vehicles on the ocean floor, letting the team know when it has enough data to take back for processing on a supercomputer.

“We really don’t want to return to base to find we’ve missed a spot,” said Woods.

Woods’ passion for 3D has its roots in the sea.

“I saw the movie Jaws 3D when I was a teenager, and the images of sharks exploding out of the screen are in part responsible for taking me down this path,” he said.

The researchers released a video of the wrecks to commemorate the 81st anniversary of the sinking of the WWII ships.


Samsung Electronics and NAVER Team Up To Develop Semiconductor Solutions Optimized for Hyperscale AI

Image credit: Samsung Electronics
Media Release by Samsung Electronics

Samsung Electronics, the world leader in advanced memory technology, and NAVER Corporation, a global internet company with top-notch AI technology, today announced a wide-reaching collaboration to develop semiconductor solutions tailored for hyperscale artificial intelligence (AI) models. Leveraging Samsung’s next-generation memory technologies like computational storage, processing-in-memory (PIM) and processing-near-memory (PNM), as well as Compute Express Link (CXL), the companies intend to pool their hardware and software resources to dramatically accelerate the handling of massive AI workloads.

Recent advances in hyperscale AI have led to an exponential growth in data volumes that need to be processed. However, the performance and efficiency limitations of current computing systems pose significant challenges in meeting these heavy computational requirements, fueling the need for new AI-optimized semiconductor solutions.

Developing such solutions requires an extensive convergence of semiconductor and AI disciplines. Samsung is combining its semiconductor design and manufacturing expertise with NAVER’s experience in the development and verification of AI algorithms and AI-driven services, to create solutions that take the performance and power efficiency of large-scale AI to a new level.

For years, Samsung has been introducing memory and storage that support high-speed data processing in AI applications, from computational storage (SmartSSD) and PIM-enabled high bandwidth memory (HBM-PIM) to next-generation memory supporting the Compute Express Link (CXL) interface. Samsung will now join with NAVER to optimize these memory technologies in advancing large-scale AI systems.

NAVER will continue to refine HyperCLOVA, a hyperscale language model with over 200 billion parameters, while improving its compression algorithms to create a more simplified model that significantly increases computation efficiency.

“Through our collaboration with NAVER, we will develop cutting-edge semiconductor solutions to solve the memory bottleneck in large-scale AI systems,” said Jinman Han, Executive Vice President of Memory Global Sales & Marketing at Samsung Electronics. “With tailored solutions that reflect the most pressing needs of AI service providers and users, we are committed to broadening our market-leading memory lineup including computational storage, PIM and more, to fully accommodate the ever-increasing scale of data.”

“Combining our acquired knowledge and know-how from HyperCLOVA with Samsung’s semiconductor manufacturing prowess, we believe we can create an entirely new class of solutions that can better tackle the challenges of today’s AI technologies,” said Suk Geun Chung, Head of NAVER CLOVA CIC. “We look forward to broadening our AI capabilities and bolstering our edge in AI competitiveness through this strategic partnership.”


Tencent Cloud Joins Forces with BeLive Technology to Elevate Livestreaming Standard in Southeast Asia and Beyond

Image credit: Tencent Cloud

Tencent Cloud announced its partnership with BeLive Technology to redefine how people communicate online, enabling organisations to accelerate their business and drive additional revenue growth in the expanding video industry.

The collaboration intends to raise the bar in livestreaming throughout Southeast Asia and beyond by combining BeLive Technology’s years of expertise in livestreaming and video marketing across various industries and scales with Tencent Cloud Media Services’ one-stop audio and video solutions.

Businesses can use video livestreaming as a powerful marketing tool to engage and interact with customers worldwide. BeLive Technology has served countless brands and organisations, with offices in Singapore and Vietnam and a growing global partner network. BeLive Technology-powered livestreams have reached more than 100 million viewers worldwide, totalling 50 million hours of programming.

The low-latency, stable, and high-quality assistance Tencent Cloud Media Services provides enables BeLive Technology to expand the opportunities for businesses of all sizes to use livestreaming and interactive technology in their video marketing plans. This is also consistent with Tencent Cloud’s continuous efforts to achieve “Immersive Convergence” to incubate a new industrial environment and shape a new way of life.

The new partnership gives BeLive Technology access to Tencent Cloud Media Services’ extensive offering: Stream Services, covering professional livestreaming, real-time media processing, on-cloud recording, and simulcasting to other social media platforms; Video on Demand (VOD), covering video storage, media asset management, and statistics analysis; and Tencent Real-Time Communication (TRTC), which focuses on real-time audio/video communication.

Tencent Cloud Media Services brings more than 20 years of experience in the audio and video fields, along with industry-leading livestreaming and video solutions. This has allowed BeLive Technology to leverage Tencent Cloud’s technological advantages and expertise to rapidly scale and enhance its capabilities, backed by continuous, consistent support from its marketing team and affordable pricing.

Poshu Yeung, Senior Vice President, Tencent Cloud International, said, “We are pleased to join forces with BeLive Technology to provide interactive live entertainment that brings virtual events closer to the audience, taking livestreaming to a new level. The collaboration reinstates our ongoing efforts to fully realize ‘Immersive Convergence,’ driving new connections that integrate digital and physical forms. We are looking forward to supporting enterprises of all sizes with our high-quality and reliable services, ultimately integrating the digital economy with reality, and always staying one step ahead in this digital era.”

“BeLive Technology is committed to elevating the livestreaming standard in Southeast Asia. To achieve this feat as a service provider, we are pleased to be benefiting from Tencent Cloud’s high-quality, secure and highly reliable solutions that would help us support businesses from all walks of life,” BeLive Technology Co-founder and CEO Kenneth Tan stated.

Tencent Cloud will continue to improve its audio and video technologies to give organisations a one-stop media solution that builds connections between businesses, users, developers, and everything else under the umbrella of “Immersive Convergence.”
