Web3


Tigran Gambaryan Honored with Hero Award at DC Blockchain Summit for Courage and Commitment to Justice | Web3Wire



The Digital Chamber proudly presented a special Hero Award to Tigran Gambaryan at the 2025 DC Blockchain Summit, recognizing his extraordinary courage, resilience, and unwavering commitment to combating illicit finance in the digital asset industry. The award was presented by U.S. Rep. Warren Davidson in a powerful moment that underscored the values of justice, freedom, and integrity. 

A former IRS Special Agent and current Binance executive, Tigran was unjustly detained in Nigeria for nearly a year in 2024 while working to promote compliance and responsible practices in the digital asset space. Despite enduring significant physical and emotional hardship, he remained steadfast in his principles — a powerful example of integrity in the face of adversity. 

“We are humbled to honor Tigran Gambaryan with the Hero Award,” said Perianne Boring, Founder and CEO of The Digital Chamber. “His life’s work — both in law enforcement and in the private sector — has made this industry safer and more accountable. His unjust detention highlighted the very real risks faced by those who uphold the rule of law. Today, we celebrate his strength, his service, and his return.” 

Congressman Davidson, a longtime advocate for civil liberties and financial innovation, delivered remarks before presenting the award and shared this statement: 

“I am proud to recognize Tigran Gambaryan – a U.S. citizen – for his resilience and bravery in the face of his unjust detention in Nigeria. His release is a relief, but it never should have happened—no American should be used as leverage by a foreign government. I’m honored to present him with the Hero Award and to welcome him home.” 

Accepting the award, Gambaryan said: “I’m deeply honored to receive this award and want to sincerely thank The Digital Chamber, the U.S. government, my incredible wife Yuki, and the countless friends—both personal and professional—who worked tirelessly to bring me home. Your support carried me through the darkest days. I hope what happened to me never happens to another compliance professional. No one should be punished for doing the right thing.” 

Tigran’s career — from leading complex investigations at the IRS to building global compliance programs in the private sector — has left a lasting impact on the digital asset industry. As this space continues to evolve, The Digital Chamber remains committed to honoring those who lead with integrity and ensuring that no one who stands up for what’s right ever stands alone. 

About The Digital Chamber
The Digital Chamber is the world’s leading trade association representing blockchain and digital asset innovators. Founded in 2014, the organization has shaped national policy, defended the industry during its most challenging periods, and secured bipartisan support for blockchain innovation. Today, The Digital Chamber is building the future of the digital economy through education, advocacy, and strategic engagement in Washington and around the world. 

About Web3Wire
Web3Wire – Information, news, press releases, events and research articles about Web3, Metaverse, Blockchain, Artificial Intelligence, Cryptocurrencies, Decentralized Finance, NFTs and Gaming. Visit Web3Wire for Web3 News and Events, Block3Wire for the latest Blockchain news and Meta3Wire to stay updated with Metaverse News.




The Future of AI Agents: Why Event-Driven Architecture Is Essential



AI agents are poised to revolutionize enterprise operations through autonomous problem-solving capabilities, adaptive workflows, and unprecedented scalability. However, the central challenge in realizing this potential isn’t simply developing more sophisticated AI models. Rather, it lies in establishing robust infrastructure that enables agents to access data, utilize tools, and communicate across systems seamlessly.

Beyond Better Models: An Infrastructure Challenge

The true hurdle in advancing AI agents isn’t an AI problem per se—it’s fundamentally an infrastructure and data interoperability challenge. For AI agents to function effectively, they need more than sequential command executions; they require a dynamic architecture that facilitates continuous data flow and interoperability. As aptly noted by HubSpot CTO Dharmesh Shah, “Agents are the new apps,” highlighting the paradigm shift in how we conceptualize AI systems.

To fully understand why event-driven architecture (EDA) represents the optimal solution for scaling agent systems, we must examine how artificial intelligence has evolved through distinct phases, each with its own capabilities and limitations.

The Evolution of Artificial Intelligence

First Wave: Predictive Models

The initial wave of AI centered on traditional machine learning approaches that produced predictive capabilities for narrowly defined tasks. These models required considerable expertise to develop, as they were custom-crafted for specific use cases with their domain specificity embedded in the training data.

This approach created inherently rigid systems that proved difficult to repurpose. Adapting such models to new domains typically meant building from scratch—a resource-intensive process that inhibited scalability and slowed adoption across enterprises.

Second Wave: Generative Models

The second wave brought generative AI to the forefront, powered by advances in deep learning. Unlike their predecessors, these models were trained on vast, diverse datasets, enabling them to generalize across various contexts. The ability to generate text, images, and even videos opened exciting new application possibilities.

However, generative models brought their own set of challenges:

They remain fixed in time, unable to incorporate new or dynamic information

Adapting them to specific domains requires expensive, error-prone fine-tuning processes

Fine-tuning demands extensive data, significant computational resources, and specialized ML expertise

Since large language models (LLMs) train on publicly available data, they lack access to domain-specific information, limiting their accuracy for context-dependent queries

For example, asking a generative model to recommend an insurance policy based on personal health history, location, and financial goals reveals these limitations. The model can only provide generic or potentially inaccurate responses without access to relevant user data.

The Compound AI Bridge

Compound AI systems emerged to overcome these limitations, integrating generative models with additional components such as programmatic logic, data retrieval mechanisms, and validation layers. This modular approach enables AI to combine tools, fetch relevant data, and customize outputs in ways static models cannot.

Retrieval-Augmented Generation (RAG) exemplifies this approach by dynamically incorporating relevant data into the model’s workflow. While RAG effectively handles many tasks, it relies on predefined workflows—every interaction and execution path must be established in advance. This rigidity makes RAG impractical for complex or dynamic tasks where workflows cannot be exhaustively encoded.
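To make the fixed-workflow nature of traditional RAG concrete, here is a minimal sketch in Python. Everything in it is illustrative: the toy document list, the word-overlap retriever, and the llm() stub are assumptions standing in for a real vector store and model API. The point is that the execution path (retrieve, then augment, then generate) is wired in advance and never changes.

```python
# Minimal fixed-workflow RAG sketch: every step is predefined in advance.
# The retriever, prompt template, and the llm() stub are illustrative stand-ins.

DOCUMENTS = [
    "Policy A covers chronic conditions and is available in California.",
    "Policy B is a low-cost plan aimed at young, healthy applicants.",
    "Policy C offers international coverage for frequent travelers.",
]

def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Score documents by naive word overlap and return the top-k."""
    q_terms = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q_terms & set(d.lower().split())), reverse=True)
    return scored[:k]

def llm(prompt: str) -> str:
    """Stand-in for a generative model call (an API request in practice)."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"

def answer(query: str) -> str:
    # Step 1: retrieve context (always runs, whether or not it is needed)
    context = retrieve(query, DOCUMENTS)
    # Step 2: augment the prompt with the retrieved context
    prompt = "Context:\n" + "\n".join(context) + f"\n\nQuestion: {query}\nAnswer:"
    # Step 3: generate; the execution path never deviates from these three steps
    return llm(prompt)

print(answer("Which policy suits someone with a chronic condition in California?"))
```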

The Third Wave: Agentic AI Systems

We’re now witnessing the emergence of the third wave of AI: agentic systems. This evolution comes as we reach the limitations of fixed systems and even advanced LLMs.

Google’s Gemini reportedly failed to meet internal expectations despite being trained on larger datasets. Similar challenges have been reported with OpenAI’s next-generation Orion model. Salesforce CEO Marc Benioff recently stated on The Wall Street Journal’s “Future of Everything” podcast that we’ve reached the upper limits of what LLMs can achieve, suggesting that autonomous agents—systems capable of independent thinking, adaptation, and action—represent the future of AI.

Agents introduce a critical innovation: dynamic, context-driven workflows. Unlike fixed pathways, agentic systems determine next steps adaptively, making them ideal for addressing the unpredictable, interconnected challenges that businesses face today.

This approach fundamentally inverts traditional control logic. Rather than rigid programs dictating every action, agents leverage LLMs to drive decisions. They can reason, utilize tools, and access memory—all dynamically. This flexibility allows for workflows that evolve in real-time, making agents substantially more powerful than systems built on fixed logic.
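That inversion of control can be sketched as a loop in which the model, not the program, picks each next step. The sketch below is a simplified illustration under assumed names (llm_decide, the TOOLS registry, the memory list); it is scripted so it runs without a model, and a real agent framework would add error handling, guardrails, and richer state.

```python
# Sketch of an agentic control loop: the LLM picks the next action each turn,
# instead of the program dictating a fixed sequence. All names are illustrative.
import json

TOOLS = {
    "search_crm": lambda arg: f"CRM records matching '{arg}'",
}

def llm_decide(goal: str, memory: list) -> dict:
    """Stand-in for an LLM call that returns the next action as JSON.
    Scripted here so the example runs without a model."""
    if not memory:
        return {"action": "search_crm", "input": goal}
    return {"action": "finish", "input": "summary of findings"}

def run_agent(goal: str, max_steps: int = 5) -> str:
    memory = []
    for _ in range(max_steps):
        decision = llm_decide(goal, memory)          # reason: choose the next step
        if decision["action"] == "finish":
            return decision["input"]
        tool = TOOLS[decision["action"]]             # act: call the chosen tool
        observation = tool(decision["input"])
        memory.append(json.dumps(decision) + " -> " + observation)  # remember
    return "stopped: step budget exhausted"

print(run_agent("customers likely to churn this quarter"))
```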

Design Patterns That Empower Agents

AI agents derive their effectiveness not only from core capabilities but also from the design patterns that structure their workflows and interactions. These patterns enable agents to address complex problems, adapt to changing environments, and collaborate effectively.

Reflection: Self-Improvement Through Evaluation

Reflection allows agents to evaluate their own decisions and improve outputs before taking action or providing final responses. This capability enables agents to identify and correct mistakes, refine reasoning processes, and ensure higher-quality outcomes.
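A minimal sketch of the reflection pattern, with the draft, critique, and revise model calls stubbed out so the loop is runnable: the agent evaluates its own output and only returns it once the critique passes or a retry budget is exhausted.

```python
# Reflection sketch: generate, self-critique, revise. The three model calls are
# stubbed so the loop runs; in practice each would be an LLM request.

def draft(task: str) -> str:
    return f"First attempt at: {task}"

def critique(answer: str) -> str:
    # Return "OK" if acceptable, otherwise a description of the problem.
    return "OK" if "revised" in answer else "Missing supporting detail."

def revise(answer: str, feedback: str) -> str:
    return f"{answer} (revised to address: {feedback})"

def reflect_and_answer(task: str, max_rounds: int = 3) -> str:
    answer = draft(task)
    for _ in range(max_rounds):
        feedback = critique(answer)        # evaluate the agent's own output
        if feedback == "OK":
            break
        answer = revise(answer, feedback)  # correct before responding
    return answer

print(reflect_and_answer("summarize Q3 churn drivers"))
```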

Tool Use: Expanding Capabilities

Integrating with external tools extends an agent’s functionality, allowing it to perform tasks like data retrieval, process automation, or execution of deterministic workflows. This capability is particularly valuable for operations requiring strict accuracy, such as mathematical calculations or database queries. Tool use effectively bridges the gap between flexible decision-making and predictable, reliable execution.
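A sketch of the tool-use pattern under assumed names: a stubbed routing step stands in for the model deciding which tool a request needs, while the exact work (arithmetic, a database lookup) is delegated to deterministic code rather than generated.

```python
# Tool-use sketch: flexible decision-making up front, deterministic execution
# behind it. The model_route() stub stands in for an LLM tool-selection call.
import sqlite3

def calculator(expression: str) -> str:
    # Deterministic arithmetic instead of asking the model to guess the result.
    return str(eval(expression, {"__builtins__": {}}))  # toy only; use a real parser in practice

def order_lookup(order_id: str) -> str:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id TEXT, status TEXT)")
    conn.execute("INSERT INTO orders VALUES ('42', 'shipped')")
    row = conn.execute("SELECT status FROM orders WHERE id = ?", (order_id,)).fetchone()
    return row[0] if row else "not found"

TOOLS = {"calculator": calculator, "order_lookup": order_lookup}

def model_route(request: str):
    """Stand-in for the LLM choosing a tool and its argument."""
    if any(ch.isdigit() for ch in request) and "+" in request:
        return "calculator", request
    return "order_lookup", "42"

tool_name, tool_arg = model_route("17 + 25")
print(tool_name, "->", TOOLS[tool_name](tool_arg))
```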

Planning: Transforming Goals Into Action

Agents with planning capabilities can decompose high-level objectives into actionable steps, organizing tasks in logical sequences. This design pattern proves crucial for solving multi-step problems or managing workflows with dependencies.
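A sketch of the planning pattern with a stubbed plan() call: the high-level goal is decomposed into ordered steps with explicit dependencies, which are then executed in sequence.

```python
# Planning sketch: decompose a goal into ordered steps, then execute them in
# dependency order. plan() is a stand-in for an LLM planning call.

def plan(goal: str) -> list:
    return [
        {"id": 1, "task": f"Gather data relevant to '{goal}'", "depends_on": []},
        {"id": 2, "task": "Analyze the gathered data", "depends_on": [1]},
        {"id": 3, "task": "Draft recommendations", "depends_on": [2]},
    ]

def execute(step: dict) -> str:
    return f"done: {step['task']}"

def run_plan(goal: str) -> list:
    results, completed = [], set()
    for step in plan(goal):  # steps are already topologically ordered here
        assert all(dep in completed for dep in step["depends_on"]), "unmet dependency"
        results.append(execute(step))
        completed.add(step["id"])
    return results

for line in run_plan("reduce onboarding drop-off"):
    print(line)
```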

Multi-Agent Collaboration: Modular Problem-Solving

Multi-agent systems take a modular approach by assigning specific tasks to specialized agents. This approach offers flexibility—smaller language models (SLMs) can be deployed for task-specific agents to improve efficiency and simplify memory management. The modular design reduces complexity for individual agents by focusing their context on specific tasks.

A related technique, Mixture-of-Experts (MoE), employs specialized submodels or “experts” within a unified framework. Like multi-agent collaboration, MoE dynamically routes tasks to the most relevant expert, optimizing computational resources and enhancing performance. Both approaches emphasize modularity and specialization—whether through multiple agents working independently or through task-specific routing in a unified model.
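The routing idea shared by multi-agent collaboration and MoE can be sketched as a dispatcher that hands each task to a specialized worker. The specialists and the keyword router below are illustrative assumptions; in practice the routing decision would itself be model-driven (or, in MoE, a learned gating network).

```python
# Multi-agent routing sketch: a router assigns each task to a specialized agent,
# keeping each agent's context narrow. The agents here are simple callables.

def billing_agent(task: str) -> str:
    return f"[billing specialist] resolved: {task}"

def support_agent(task: str) -> str:
    return f"[support specialist] resolved: {task}"

def research_agent(task: str) -> str:
    return f"[research specialist] resolved: {task}"

SPECIALISTS = {"billing": billing_agent, "support": support_agent, "research": research_agent}

def route(task: str) -> str:
    """Naive keyword router; an LLM or gating network would decide this dynamically."""
    if "invoice" in task or "refund" in task:
        return "billing"
    if "bug" in task or "error" in task:
        return "support"
    return "research"

for task in ["refund for invoice 1042", "error when exporting report", "summarize competitor pricing"]:
    print(SPECIALISTS[route(task)](task))
```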

Agentic RAG: Adaptive and Context-Aware Retrieval

Agentic RAG evolves traditional RAG by making it more dynamic and context-driven. Instead of relying on fixed workflows, agents determine in real-time what data they need, where to find it, and how to refine queries based on the task. This flexibility makes agentic RAG well-suited for handling complex, multi-step workflows requiring responsiveness and adaptability.

For instance, an agent creating a marketing strategy might begin by extracting customer data from a CRM, use APIs to gather market trends, and refine its approach as new information emerges. By maintaining context through memory and iterating on queries, the agent produces more accurate and relevant outputs. Agentic RAG effectively combines retrieval, reasoning, and action capabilities.
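A sketch of agentic RAG with assumed sources and a stubbed decision step: rather than following a fixed retrieve-then-generate path, the agent chooses per step which source to query, keeps what it has learned in memory, and stops once it judges the context sufficient.

```python
# Agentic RAG sketch: the agent chooses sources dynamically and iterates until
# it has enough context. The sources and the decision heuristic are illustrative.

SOURCES = {
    "crm": lambda q: "Top customers by revenue: Acme, Globex.",
    "market_api": lambda q: "Market trend: demand shifting to usage-based pricing.",
}

def decide_next_source(question, memory):
    """Stand-in for the agent reasoning about what context it still needs."""
    if not any("customers" in m for m in memory):
        return "crm"
    if not any("trend" in m for m in memory):
        return "market_api"
    return None  # enough context gathered

def agentic_rag(question: str, max_hops: int = 4) -> str:
    memory = []
    for _ in range(max_hops):
        source = decide_next_source(question, memory)
        if source is None:
            break
        memory.append(SOURCES[source](question))  # retrieve and remember
    return f"Answer to '{question}' grounded in: " + " | ".join(memory)

print(agentic_rag("Draft a marketing strategy for Q4"))
```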

The Challenge of Scaling Intelligent Agents

Scaling agents—whether individual or collaborative systems—fundamentally depends on their ability to access and share data effortlessly. Agents must gather information from multiple sources, including other agents, tools, and external systems, to make informed decisions and take appropriate actions.

Connecting agents to necessary tools and data represents a distributed systems challenge, similar to those faced when designing microservices architectures. Components must communicate efficiently without creating bottlenecks or rigid dependencies.

Like microservices, agents must communicate effectively and ensure their outputs provide value across broader systems. Their outputs shouldn’t merely loop back into AI applications—they should flow into critical enterprise systems like data warehouses, CRMs, customer data platforms, and customer success platforms.

While agents and tools could connect through RPC calls and APIs, this approach creates tightly coupled systems. Tight coupling inhibits scalability, adaptability, and support for multiple consumers of the same data. Agents require flexibility, with outputs seamlessly feeding into other agents, services, and platforms without establishing rigid dependencies.

The solution? Loose coupling through event-driven architecture—the essential foundation that enables agents to share information, respond in real-time, and integrate with broader ecosystems without the complications of tight coupling.

Event-Driven Architecture: A Foundation for Modern Systems

In computing’s early days, software systems existed as monoliths—everything contained within a single, tightly integrated codebase. While initially simple to build, monoliths became problematic as they grew.

Scaling represented a blunt instrument: entire applications required scaling, even when only specific components needed additional resources. This inefficiency led to bloated systems and brittle architectures incapable of accommodating growth.

Microservices transformed this paradigm by decomposing applications into smaller, independently deployable components. Teams could scale and update specific elements without affecting entire systems. However, this created a new challenge: establishing effective communication between distributed services.

Connecting services through direct RPC or API calls creates complex interdependencies. When one service fails, it impacts all nodes along connected paths, creating cascading failures.

Event-driven architecture (EDA) resolved this problem by enabling asynchronous communication through events. Services don’t wait for each other—they react to real-time occurrences. This approach enhances resilience and adaptability, allowing systems to manage the complexity of modern workflows. EDA represents not just a technical improvement but a strategic necessity for systems under pressure.
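The contrast with direct calls can be shown with a tiny in-memory event bus: the producer publishes events without knowing who consumes them, so consumers can be added or removed without touching the producer. This is a toy illustration of the decoupling idea, not a substitute for a durable broker.

```python
# Toy event bus illustrating EDA's loose coupling: the producer never references
# its consumers, so consumers can be added or removed independently.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        self.subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self.subscribers[topic]:
            handler(event)  # a real broker would deliver asynchronously and durably

bus = EventBus()
bus.subscribe("order.created", lambda e: print("billing service charges", e["order_id"]))
bus.subscribe("order.created", lambda e: print("analytics service records", e["order_id"]))

# The producer emits one event; it neither knows nor cares how many services react.
bus.publish("order.created", {"order_id": "A-1001"})
```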

Learning from History: The Rise and Fall of Early Social Platforms

The trajectories of early social networks like Friendster provide instructive lessons about scalable architecture. Friendster initially attracted a massive user base but ultimately failed because its systems couldn’t handle the increasing demand. Performance issues drove users away, leading to the platform’s demise.

Conversely, Facebook thrived not merely because of its features but because it invested in scalable infrastructure. Rather than collapsing under success, it expanded to market dominance.

Today, we face similar potential outcomes with AI agents. Like early social networks, agents will experience rapid adoption. Building capable agents isn’t sufficient—the critical question is whether your underlying architecture can manage the complexity of distributed data, tool integrations, and multi-agent collaboration. Without proper foundations, agent systems risk failure similar to early social media casualties.

Why AI’s Future Depends on Event-Driven Agents

The future of AI transcends building smarter agents—it requires creating systems that evolve and scale as technology advances. With AI stacks and underlying models changing rapidly, rigid designs quickly become innovation barriers. Meeting these challenges demands architectures prioritizing flexibility, adaptability, and seamless integration. EDA provides this foundation, enabling agents to thrive in dynamic environments while maintaining resilience and scalability.

Agents as Microservices with Informational Dependencies

Agents resemble microservices: autonomous, decoupled, and capable of independent task execution. However, agents extend beyond typical microservices.

While microservices typically process discrete operations, agents depend on shared, context-rich information for reasoning, decision-making, and collaboration. This creates unique requirements for managing dependencies and ensuring real-time data flows.

An agent might simultaneously access customer data from a CRM, analyze live analytics, and utilize external tools—all while sharing updates with other agents. These interactions require systems where agents operate independently while exchanging critical information seamlessly.

EDA addresses this challenge by functioning as a “central nervous system” for data. It allows agents to broadcast events asynchronously, ensuring dynamic information flow without creating rigid dependencies. This decoupling enables agents to operate autonomously while integrating effectively into broader workflows and systems.

Maintaining Context While Decoupling Components

Building flexible systems doesn’t mean sacrificing contextual awareness. Traditional, tightly coupled designs often bind workflows to specific pipelines or technologies, forcing teams to navigate bottlenecks and dependencies. Changes in one area of the system ripple through the entire ecosystem, impeding innovation and scaling efforts.

EDA eliminates these constraints through workflow decoupling and asynchronous communication, allowing different stack components—agents, data sources, tools, and application layers—to function independently.

In today’s AI stack, MLOps teams manage pipelines like RAG, data scientists select models, and application developers build interfaces and backends. Tightly coupled designs force unnecessary interdependencies between these teams, slowing delivery and complicating adaptation as new tools emerge.

Event-driven systems ensure workflows remain loosely coupled, allowing independent innovation across teams. Application layers don’t need to understand AI internals—they simply consume results when needed. This decoupling also ensures AI insights extend beyond silos, enabling agent outputs to integrate seamlessly with CRMs, CDPs, analytics tools, and other systems.

Scaling Agents with Event-Driven Architecture

EDA forms the backbone of the transition to agentic systems: its ability to decouple workflows while facilitating real-time communication ensures agents operate efficiently at scale. Platforms like Kafka exemplify EDA’s advantages in agent-driven systems (a minimal Kafka sketch follows the list):

Horizontal Scalability: Kafka’s distributed design supports adding new agents or consumers without bottlenecks, enabling effortless system growth

Low Latency: Real-time event processing allows agents to respond instantly to changes, ensuring fast, reliable workflows

Loose Coupling: Communication through Kafka topics rather than direct dependencies keeps agents independent and scalable

Event Persistence: Durable message storage guarantees data preservation during transit, critical for high-reliability workflows
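As a concrete illustration of the properties listed above, here is a minimal sketch using the kafka-python client (one of several Kafka clients; confluent-kafka is a common alternative). The broker address, topic, and group names are illustrative assumptions: one agent publishes its output as an event, and any number of downstream agents or services consume it independently.

```python
# Minimal Kafka sketch of agents communicating through a topic rather than
# direct calls. Assumes a local broker; topic and group names are illustrative.
# pip install kafka-python
import json
from kafka import KafkaConsumer, KafkaProducer

TOPIC = "agent.events"

# Producing agent: publish a result as an event and move on (no waiting on consumers).
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"agent": "research-agent", "finding": "churn risk rising in EU segment"})
producer.flush()

# Consuming agent (typically a separate process): react to events as they arrive.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers="localhost:9092",
    group_id="strategy-agents",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop polling after 5s of silence in this demo
)
for message in consumer:
    event = message.value
    print(f"strategy agent reacting to {event['agent']}: {event['finding']}")
```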

Data streaming enables continuous information flow throughout organizations. A central nervous system serves as the unified backbone for real-time data transmission, connecting disparate systems, applications, and data sources to facilitate efficient agent communication and decision-making.

This architecture aligns naturally with frameworks like Anthropic’s Model Context Protocol (MCP), which provides a universal standard for integrating AI systems with external tools, data sources, and applications. By simplifying these connections, MCP reduces development effort while enabling context-aware decision-making.

EDA addresses many challenges that MCP aims to solve, including seamless access to diverse data sources, real-time responsiveness, and scalability for complex multi-agent workflows. By decoupling systems and enabling asynchronous communication, EDA simplifies integration and ensures agents can consume and produce events without rigid dependencies.

Conclusion: Event-Driven Agents Will Define AI’s Future

The AI landscape continues evolving rapidly, and architectures must adapt accordingly. Businesses appear ready for this transition—a Forum Ventures survey found 48% of senior IT leaders prepared to integrate AI agents into operations, with 33% indicating they’re very prepared. This demonstrates clear demand for systems capable of scaling and managing complexity.

EDA represents the key to building agent systems that combine flexibility, resilience, and scalability. It decouples components, enables real-time workflows, and ensures agents integrate seamlessly into broader ecosystems.

Organizations adopting EDA won’t merely survive—they’ll gain competitive advantages in this new wave of AI innovation. Those who fail to embrace this approach risk becoming casualties of their inability to scale, much like the early social platforms that collapsed under their own success. As AI agents become increasingly central to enterprise operations, the foundations we build today will determine which systems thrive tomorrow.




Chronicle Secures $12 Million to Drive Blockchain Excellence in Tokenized Asset Management – Web3oclock



Driving Tokenized Asset Adoption with Verified Data:

Expanding Infrastructure and Compliance Measures:

Addressing the Growing Demand for Tokenized Assets:

Strategic Vision for the Future:

About Chronicle:




Maiven Gains €1.7 Million to Transform Climate Policy Tracking Through AI – Web3oclock



Addressing Climate Policy Challenges with AI:

Strategic Partnership with Climate Impact Partners

Real-Time Monitoring and Automated Alerts:

About Maiven:




Elizabeth Warren Calls Stablecoin Bill a Trump and Musk ‘Grift’ – Decrypt




On Wednesday, U.S. Senator Elizabeth Warren (D-MA) took aim at a stablecoin-focused bill making its way through Congress, accusing U.S. President Donald Trump of using the legislation to further his own financial interests.

The Massachusetts senator shared the critique while linking a post about President Trump’s decentralized finance project, World Liberty Financial, launching its own stablecoin USD1 on Ethereum and Binance’s BNB Chain.

Warren voiced her concerns on social media, claiming President Trump is leveraging the project as a “grift” to “enrich” himself. 

“Congress should step up and fix the current stablecoin bill moving through the Senate that will make it easier for Trump—and Elon Musk—to take control of your money,” Warren wrote, criticizing the “Financial Innovation and Technology for the 21st Century Act” (FIT21) bill.

The FIT21 bill seeks to create a clear regulatory framework for digital assets, with U.S. Rep. French Hill (R-AR) mentioning that in the “next few days,” legislators will roll out a revised bill.

In the meantime, the Trump administration is moving to make the U.S. the “crypto capital of the world” through a series of initiatives, including the creation of an SEC task force dedicated to overseeing digital asset regulations. 

President Trump also called for “simple, common-sense rules for stablecoins and market structure” during a video call at the Blockworks crypto conference in New York last Thursday.

The stablecoin market currently holds over $238 billion in circulation, as per CoinGecko data, with Tether (USDT) making up a significant portion. 

Trump’s crypto czar, David Sacks, has previously promised to introduce legislation on stablecoins and market structures within the first 100 days of Trump’s second term.

The Massachusetts senator recently challenged Sacks to prove he’s not “directly profiting off of the Trump Administration’s efforts to selectively pump the value of certain crypto assets,” as he claimed he sold all his crypto assets before taking on his role as crypto czar.

Elon Musk’s influence within the government, particularly through his role in the Department of Government Efficiency (DOGE), has only fueled Warren’s concerns. 

The initiative, which Musk heads, aims to reduce government bureaucracy and eliminate excess regulations, but it has faced criticism for potentially giving Musk—and by extension, his business interests—a disproportionate amount of influence over U.S. financial policy.

In January, Warren lambasted DOGE in a letter to the Dogecoin aficionado, accusing the department of being a potential “venue for corruption.” 

The senator’s letter to Musk suggested a range of changes, including cracking down on tax loopholes for the wealthy and reforming government contracts to cut wasteful spending.

Edited by Sebastian Sinclair





The UX overhaul blockchain needs to reach a billion users



The following is a guest post from Susannah Evans, IBC Product Lead at Interchain Foundation.

The future of the internet is shaping up to be promising, and there is no doubt that blockchain and Web3 technologies have been at the forefront of this innovation, promising decentralization, security, and financial sovereignty. Yet despite these advancements, mass adoption of the technology remains elusive. The primary culprit? A poor user experience. Even though interoperability protocols have improved significantly, moving assets and interacting across multiple chains remains too complex for institutional and everyday users.

The recent Cross-Chain Interoperability Report 2024 highlights that the biggest challenge to adoption is the high friction users face when they navigate blockchain ecosystems. As of today, users need to manage multiple wallets, manually sign numerous transactions, and navigate complexities when identifying the optimal route for transferring assets between chains. These inefficiencies have forced users into ecosystem silos rather than encouraging them to explore more cross-chain interactions.

When comparing the Web2 experience to that of Web3, the difference is night and day. Take traditional financial transactions as an example. There is still plenty of complexity in Web2, but Web2 is simply better at hiding it, so users can navigate the space without thinking about the backend. When sending money through a payment app, users need not bother about bank settlement layers, messaging protocols, or verifying different clearing mechanisms. Web3, by comparison, places too much of this burden on users, requiring them to understand the backend and forcing them to deal with intricate transaction approvals, security risks, and token management systems. This has been fine to date for an audience of crypto enthusiasts keen to understand the tech they operate on. However, as the Web3 ecosystem looks to scale to a global user base, the industry must rethink this user experience to capture the attention of the layperson who has no prior dealings with blockchain.

Interoperability’s growing pains – what’s stopping Web3 from going mainstream?

There is no denying that interoperability is solving some of the technical limitations of blockchain technology. For users, however, the experience still leaves much to be desired. Recent data indicates that over 85 million people worldwide use blockchain wallets, yet despite this growing adoption, wallet fragmentation remains a glaring issue. Unlike in Web2, where a single login provides access to multiple services, blockchain requires users to maintain different wallets for different ecosystems. This makes cross-chain interactions painstaking, as switching between multiple wallets is neither intuitive nor seamless.

Managing wallets across chains remains a major point of friction. While transaction batching has reduced the burden of multi-signing, users still often need to switch wallets when interacting across different blockchains. This process is not only tedious but also increases the likelihood of human error—such as accidentally approving the wrong contract or sending assets to an incorrect address—leading to potential loss of funds. Seamless interoperability should mean users can move assets and interact across chains without constantly switching wallets or navigating approval processes that are still manual.

Security concerns complicate the case for Web3 adoption further. With an aggregate of $2.7 billion lost to cross-chain bridge exploits from July 2021 to August 2024 alone, it is little wonder that many users hesitate to move assets across blockchains for fear of hacks or transaction failures. When a single mistake can result in permanent asset loss, even experienced users remain cautious about cross-chain transactions. While significant strides have been made in addressing these challenges, interoperability solutions must account for differences among chains to build trust and ensure security, reliability, and a seamless experience for everyday users.

Solver-based bridging: A new approach to UX

One of the emerging solutions to blockchain’s user experience crisis is intent-centric, solver-based bridging protocols. Acting as a form of chain abstraction, these protocols operate on an “intent”, a specific goal the user wishes to accomplish, such as swapping tokens between two chains, without requiring the user to navigate the cross-chain complexities themselves. Instead of selecting a bridge, signing multiple transactions manually, and monitoring the process until the transaction completes, users simply define their intent, and automated solvers execute the action in the most efficient way possible. Intent-based chain abstraction solutions are becoming an increasingly popular architecture, with many component-based products potentially coming together like puzzle pieces to gradually shape the final form of chain abstraction.

For example, if a user wants to exchange ETH on Ethereum for USDC on Solana, a solver-based protocol has the capability to identify the best route, align all the necessary approvals, and then complete the transaction—all this without the user being required to make any technical decisions. This drastically reduces the high level of friction users face and improves security by minimizing errors due to manual interventions.
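To make the intent flow concrete, here is a hedged sketch of what an intent and a naive solver-selection step might look like. The data structure, the quotes, and the solver names are illustrative assumptions, not any particular protocol’s message format.

```python
# Illustrative intent/solver sketch: the user states the outcome they want,
# solvers compete to fulfil it, and the best route is executed on their behalf.
from dataclasses import dataclass

@dataclass
class Intent:
    give_asset: str      # what the user is willing to spend
    give_amount: float
    want_asset: str      # what the user wants to end up with
    want_chain: str
    max_slippage: float  # user-defined tolerance; everything else is the solver's problem

def collect_quotes(intent: Intent) -> list:
    """Stand-in for querying competing solvers for the amount they can deliver."""
    return [
        {"solver": "solver-a", "delivered": 1512.4, "route": "ETH -> bridge X -> USDC"},
        {"solver": "solver-b", "delivered": 1518.9, "route": "ETH -> DEX -> bridge Y -> USDC"},
    ]

def pick_best(quotes: list) -> dict:
    # The user never sees this step: the best route is chosen automatically.
    return max(quotes, key=lambda q: q["delivered"])

intent = Intent("ETH", 0.5, "USDC", "Solana", max_slippage=0.005)
best = pick_best(collect_quotes(intent))
print(f"Filling intent via {best['solver']} ({best['route']}), delivering {best['delivered']} USDC")
```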

Intent-centric/solver-based bridging protocols aren’t just about simplifying transactions; they are also about making Web3 interactions feel as smooth as traditional Web2 experiences. With these solver-based protocols handling tasks like route optimization and execution, users no longer need to worry about the underlying infrastructure as they simply get their desired result.

Making the Web3 backend invisible: Are chain abstraction and ZKPs the solution?

For Web3 to reach a stage of mass adoption, the underlying complexities that users must currently navigate need to be eliminated. While solver-based bridging protocols improve cross-chain interoperability, chain abstraction and zero-knowledge proofs can be implemented in many other ways to make the overall Web3 UX better. While chain abstraction makes blockchain interactions feel seamless, allowing everyday users to engage with dApps without worrying about the underlying infrastructure, zero-knowledge proofs (ZKPs) enable the verification of information without revealing the information itself, giving individuals and organizations assurance that their information is safe. These technologies eliminate the need for users to switch networks, bridge assets, or manage different token standards. Additionally, these advancements move blockchain technology beyond just technical innovation and into a system that simply works well. If it wasn’t evident already, it should be by now that the most successful technology isn’t the most complex—rather, it’s the one people don’t even realize they’re using. This is reflected in the popularity of these technologies, which are already gaining traction.

The Web3 industry has spent years and significant resources looking for solutions to improve scalability, security, and interoperability along with building trust. It is now time to bring into sharp focus the evolving needs of users and make this pathbreaking technology accessible to everyday users. If the Web3 ecosystem truly wants to onboard the next billion users, it is time the user experience becomes a key priority and the focus shifts from just building infrastructure.

It can be said in no clearer words—user experience is the key to mainstream adoption. Solutions like solver-based bridging protocols, chain abstraction, and zero-knowledge proofs represent a fundamental shift in how users are beginning to interact with various blockchains. By prioritizing these innovations, the Web3 ecosystem is on a path where the future of Web3 becomes as seamless as what we all have come to expect with Web2. After all, a billion users won’t adopt blockchain technology because of what it can do—it will only see mainstream adoption when individuals can engage with it without even thinking about it.





Nvidia GTC2025: 100x more computation needed



Look at this word cloud. It’s not just a colorful visualization – it’s the pulse of our technological future captured in a single image. The words that dominated Jensen Huang’s GTC 2025 keynote tell a story that should make every technologist, investor, and futurist sit up straight.

“GPU.” “AI.” “Computing.” “Factory.” “Token.” These aren’t just buzzwords – they’re the vocabulary of a revolution unfolding in real time.

And then Jensen dropped the bombshell that sent shockwaves across the industry:

We need 100x more compute

“The scaling law, this last year, is where almost the entire world got it wrong. The computation requirement, the scaling law of AI is more resilient and in fact, hyper accelerated. The amount of computation we need at this point is easily 100 times more than we thought we needed this time last year.”

Let that sink in. Not 20% more. Not double. One hundred times more compute than anticipated just twelve months ago.

Remember when we thought AI was advancing fast? Turns out, we were dramatically underestimating the compute hunger of truly intelligent systems. This isn’t gradual evolution – it’s a sudden, dramatic reimagining of what our infrastructure needs to become.

Why? Because AI has learned to think and act.

Jensen illustrated this with a seemingly simple problem – organizing a wedding seating chart while accommodating family feuds, photography angles, and traditional constraints. Llama 3.1 tackled it with a quick 439 tokens, confidently serving up the wrong answer. DeepSeek’s reasoning model, by contrast, generated over 8,000 tokens, methodically thinking through approaches, checking constraints, and testing solutions.

This is the difference between an AI that simply responds and one that truly reasons. And that reasoning requires exponentially more computational horsepower.

What does this mean for the industry?

If you’re building AI applications, your infrastructure roadmap just changed dramatically. If you’re investing in tech, the winners will be those who can solve this compute challenge. And if you’re watching from the sidelines, prepare to witness a massive transformation of our digital landscape.

The hunt for 100X compute isn’t just NVIDIA’s problem – it’s the defining challenge for the entire tech ecosystem. And how we respond will reshape industries, markets, and possibly society itself.

The question isn’t whether we need to scale dramatically – it’s how we’ll achieve this scale in ways that are practical, sustainable, and accessible to more than just the tech giants.

The race for the next generation of compute has officially begun. And the stakes couldn’t be higher.

Data Centres will be power limited

While Jensen’s 100X revelation left the audience stunned, it was his description of how computing itself is changing that truly illuminates the path forward.

“Every single data center in the future will be power limited. The revenues are power limited.”

This isn’t just a technical constraint – it’s an economic reality that’s reshaping the entire compute landscape. When your ability to generate value is directly capped by how much power you can access and efficiently use, the game changes completely.

The traditional approach? Build bigger data centers. But as Jensen pointed out, we’re approaching a trillion-dollar datacenter buildout globally – a staggering investment that still won’t satisfy our exponentially growing compute demands, especially with these new power constraints.

This is where the industry finds itself at a crossroads, quietly exploring alternative paths that could complement the traditional centralized model.

What if the solution isn’t just building more massive data centers, but also harnessing the vast ocean of underutilized compute that already exists? What if we could tap into even a fraction of the idle processing power sitting in devices worldwide?

Jensen himself hinted at this direction when discussing the transition from retrieval to generative computing:

“Generative AI fundamentally changed how computing is done. From a retrieval computing model, we now have a generative computing model.”

This shift doesn’t just apply to how AI generates responses – it can extend to how we generate and allocate compute resources themselves.

At Spheron, we are exploring precisely this frontier – envisioning a world where compute becomes programmable, decentralized, and accessible through a permissionless protocol. Rather than just building more centralized factories, our approach aims to create fluid marketplaces where compute can flow to where it’s needed most.

Agents, Agents & Agents

Jensen didn’t just talk about more powerful hardware – he laid out a vision for a fundamentally new kind of AI:

“Agentic AI basically means that you have an AI that has agency. It can perceive and understand the context of the circumstance. It can reason, very importantly, can reason about how to answer or how to solve a problem and it can plan an action. It can plan and take action.”

These agentic systems don’t just respond to prompts; they navigate the world, make decisions, and execute plans autonomously.

“There’s a billion knowledge workers in the world. They’re probably going to be 10 billion digital workers working with us side-by-side.”

Supporting 10 billion digital workers requires not just computational power, but computational independence – infrastructure that allows these digital workers to acquire and manage their own resources.

An agent that can reason, plan, and act still hits a wall if it can’t secure the computational resources it needs without human intervention.

As Jensen’s presentation made clear, we’re building AIs that can think, reason, and act with increasingly human-like capabilities. But unlike humans, most of these AIs can’t independently acquire the resources they need to function. They remain dependent on API keys, cloud accounts, and payment methods controlled by humans.

Solving this requires more than just powerful hardware – it demands new infrastructure models designed specifically for agent autonomy. This is where Spheron’s programmable infrastructure comes into play where agents can directly lease compute resources through smart contracts without human intermediation.

New approach to increase efficiency

As Jensen guided us through his roadmap for the next generation of AI hardware, he revealed a fundamental truth that transcends mere technical specifications:

“In a data center, we could save tens of megawatts. Let’s say 10 megawatts, well, let’s say 60 megawatts, 60 megawatts is 10 Rubin Ultra racks… 100 Rubin Ultra racks of power that we can now deploy into Rubins.”

This isn’t just about efficiency – it’s about the compute economics that will govern the AI era. In this world, every watt saved translates directly into computational potential. Energy isn’t just an operating expense; it’s the fundamental limiting factor on what’s possible.

When the computational ceiling is determined by power constraints rather than hardware availability, the economics of AI shift dramatically.

The question becomes not just “How much compute can we build?” but “How can we extract maximum value from every available watt?”

While NVIDIA focuses on squeezing more computation from each watt through better hardware design, we have designed a complementary approach that tackles the problem from a different angle.

What if, instead of just making each processor more efficient, we could more efficiently utilize all the processors that already exist?

This is where decentralized physical infrastructure networks (DePIN) like Spheron find their economic rationale: ensuring that no computational potential goes to waste.

The numbers tell a compelling story. At any given moment, compute worth more than $500B sits idle or underutilized across millions of powerful GPUs in data centres, gaming PCs, workstations, and small server clusters worldwide. Even harnessing a fraction of this latent compute power could significantly expand our collective AI capabilities without requiring additional energy investment.

The new compute economics isn’t just about making chips more efficient – it’s about ensuring that every available chip is working on the most valuable problems.

What lies ahead

The 100X computation requirement isn’t just a technical challenge – it’s an invitation to reimagine our entire approach to infrastructure. It’s pushing us to invent new ways of scaling, new methods of allocation, and new models for access that extend far beyond traditional data center paradigms.

The word cloud we began with captures not just the keywords of Jensen’s keynote, but the vocabulary of this emerging future – a world where “scale,” “AI,” “token,” “factory,” and “compute” converge to create possibilities we’re only beginning to imagine.

As Jensen himself put it: “This is the way to solve this problem is to disaggregate… But as a result, we have done the ultimate scale up. This is the most extreme scale up the world has ever done.”

The next phase of this journey will involve not just scaling up, but scaling out – extending computational capacity across new types of infrastructure, new access models, and new autonomous systems that can manage their own resources.

We’re not just witnessing an evolution in computation, but a revolution in how computation is organized, accessed, and deployed. And in that revolution lies perhaps the greatest opportunity of our technological era – the chance to build systems that don’t just augment human capability, but fundamentally transform what’s possible at the intersection of human and machine intelligence.

The future will require not just better hardware, but smarter infrastructure that’s as programmable, as flexible, and ultimately as autonomous as the AI systems it powers.

That’s the true horizon of possibility that emerged from GTC 2025 – not just more powerful chips, but a fundamentally new relationship between computation and intelligence that will reshape our technological landscape for decades to come.




Request Finance Surpasses $1 Billion in Bill Payments, Secures Strategic Funding to Scale Stablecoin and Fiat Finance – Web3oclock



Transforming Crypto and Fiat Finance –

Driving Growth Through Strategic Acquisitions and Compliance –

A Defining Moment for Global Crypto Adoption –

About Request Finance –




5G Chipset Market Size Is Projected To Reach $92.05 Billion By 2030 | Web3Wire



Allied Market Research published an exclusive report, titled, “5G Chipset Market Size, Share, Competitive Landscape and Trend Analysis Report by IC Type, Operational Frequency, Product and Industry Vertical : Opportunity Analysis and Industry Forecast, 2021-2030”.

The 5G chipset market size was valued at $13.26 billion in 2020, and is projected to reach $92.05 billion by 2030, registering a CAGR of 21.8% from 2021 to 2030.

Download Research Report Sample & TOC : www.alliedmarketresearch.com/request-sample/5114

5G network is an enhanced communication solution designed to deliver to the public a fully connected mobile world, comprising everything from connected automobiles and smart cities to smartphones and internet of things (IoT) devices. Further, ReefShark chipset solutions such as application-specific integrated circuits offer higher voltages, greater performance, and a reduction in footprint/bill of materials. In addition, the rise in utilization of 5G network solutions across emerging economies is anticipated to offer significant growth opportunities for the market.

The growth of the 5G chipset market is majorly driven by the rise in demand for high-speed internet and large network coverage coupled with proliferation of M2M/IoT connections. Furthermore, increase in demand for mobile broadband services is anticipated to drive the market growth. However, privacy and security concern, high investment, and technological & infrastructure challenges in the implementation of 5G network act as a prime restraint of the market. On the contrary, surge in government initiatives for building smart cities in Asia-Pacific is anticipated to provide lucrative opportunities for the expansion of the 5G chipset industry during the forecast period.

Key Market Players: The 5G chipset market report offers an in-depth analysis of the 10 prime market players that are active in the market.

Moreover, it provides their thorough financial analysis, business strategies, SWOT profile, business overview, and recently launched products & services. In addition, the report offers recent market developments such as market expansion, mergers & acquisitions, and partnerships & collaborations. The prime market players studied in the report are Qualcomm Technologies, Inc., Broadcom, Intel Corporation, Nokia Corporation, Samsung Electronics Co., Ltd., Mediatek Inc., Xilinx Inc., Huawei Technologies Co., Ltd., Qorvo, and Infineon Technologies AG.

Request For Customization @ https://www.alliedmarketresearch.com/request-for-customization/5114

Segmentation Analysis: The 5G chipset market is segmented by IC type, operational frequency, product, industry vertical, and region. The report offers an in-depth study of every segment, which helps market players and stakeholders understand the fastest-growing and highest-grossing segments in the market.

The 5G chipset market is analyzed across the globe, and the report highlights several factors that affect its performance across various regions, including North America (United States, Canada, and Mexico), Europe (Germany, France, UK, Russia, and Italy), Asia-Pacific (China, Japan, Korea, India, and Southeast Asia), South America (Brazil, Argentina, and Colombia), and the Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria, and South Africa).

The 5G chipset report provides thorough information about prime end users and an annual forecast for the period from 2022 to 2030. Moreover, it offers a revenue forecast for every year coupled with the sales growth of the market. The forecasts are provided by skilled analysts after an in-depth analysis of the market’s geography, and they are essential for gaining insight into the future prospects of the 5G chipset industry.

Key Findings Of The Study:

In 2020, the devices segment accounted for maximum revenue, and is projected to grow at a notable CAGR of 21.5% during the forecast period.

The consumer electronics and automotive and transportation segments together accounted for around 74.6% of the 5G chipset market in 2020.

The mmWave IC segment is projected to grow at a CAGR of 24.1% during the forecast period.

Asia-Pacific held the major share of the market, accounting for more than 35.5% in 2020.

The research methodology for the global 5G chipset market includes significant primary as well as secondary research. While the primary methodology encompasses widespread discussion with a plethora of valued participants, the secondary research involves a substantial amount of product/service descriptions. Furthermore, several government sites, industry bulletins, and press releases have also been properly examined to bring forth high-value industry insights.

Inquiry Before Buying : https://www.alliedmarketresearch.com/purchase-enquiry/5114

Read More Reports :

https://www.alliedmarketresearch.com/optical-detector-market-A16497

https://www.alliedmarketresearch.com/fluid-sensors-market-A16493

https://www.alliedmarketresearch.com/eo-ir-gimbal-market-A06283

Contact:
David Correa
1209 Orange Street, Corporation Trust Center, Wilmington, New Castle, Delaware 19801 USA
Int’l: +1-503-894-6022
Toll Free: +1-800-792-5285
Fax: +1-800-792-5285
help@alliedmarketresearch.com
Web: https://www.alliedmarketresearch.com

About us :

Allied Market Research (AMR) is a full-service market research and business-consulting wing of Allied Analytics LLP based in Wilmington, Delaware. Allied Market Research provides global enterprises as well as medium and small businesses with unmatched quality of “Market Research Reports” and “Business Intelligence Solutions.” AMR has a targeted view to provide business insights and consulting to assist its clients to make strategic business decisions and achieve sustainable growth in their respective market domain.

We are in professional corporate relations with various companies, and this helps us in digging out market data that helps us generate accurate research data tables and confirms utmost accuracy in our market forecasting. Each and every data presented in the reports published by us is extracted through primary interviews with top officials from leading companies of domain concerned. Our secondary data procurement methodology includes deep online and offline research and discussion with knowledgeable professionals and analysts in the industry.

This release was published on openPR.

About Web3Wire
Web3Wire – Information, news, press releases, events and research articles about Web3, Metaverse, Blockchain, Artificial Intelligence, Cryptocurrencies, Decentralized Finance, NFTs and Gaming. Visit Web3Wire for Web3 News and Events, Block3Wire for the latest Blockchain news and Meta3Wire to stay updated with Metaverse News.




UK Startup Rival Secures $4.2 Million to Launch 3D Content Creation Platform – Web3oclock
