Web3


Solana, XRP and Dogecoin ETF Approvals in 2025 Are a Near Lock, Analysts Say – Decrypt




In brief

Top analysts are near-certain that numerous crypto spot ETF applications, including Dogecoin, Solana, and XRP, will be approved by year’s end.
James Seyffart, an ETF analyst at Bloomberg, said the approvals could come as soon as next month or as late as the fall—but that regardless, the question at hand is now “when, not if.”
Other altcoin ETFs expected to begin trading on Wall Street include Litecoin, Cardano, Polkadot, and Avalanche.

Two top Wall Street analysts are confident that many top altcoin ETFs will imminently be approved for trading—so confident, in fact, that they now put the likelihood of such spot approvals coming before the end of the year at almost 100%.

Solana, XRP, and Litecoin spot ETFs are near-locks at 95% odds of approval from the U.S. Securities and Exchange Commission by the end of 2025, the analysts, Eric Balchunas and James Seyffart of Bloomberg, wrote Friday. 

Dogecoin, Cardano, Polkadot, Hedera, and Avalanche spot ETF applications are also sitting quite pretty, according to the analysts, with a 90% chance of approval by year’s end.

If the above altcoin ETF applications receive an SEC green light in the coming months, then the development would mark a substantial milestone in the history of Wall Street. Thus far, the agency has approved only two categories of crypto spot ETFs: Bitcoin and Ethereum.

The success of those funds has spurred additional demand for crypto-focused ETFs and other related investment products. Spot Bitcoin ETFs now manage well over $100 billion in assets, with BlackRock’s iShares Bitcoin Trust (IBIT) reaching $70 billion in AUM faster than any fund in history, based on company data. 

Crypto’s two top tokens have long been considered to belong to a league of their own in terms of legitimacy, stability, and staying power, and even their approval for mainstream trading was no easy feat.

Among the current batch of contenders for spot ETF trading are tokens that have significantly smaller market values and less established reputations than Bitcoin and Ethereum. 



Dogecoin, for instance, is the world’s first meme coin; Avalanche is the native token of a network that boasts less than 2% of the total value locked on Ethereum. DOT, the native token of the Polkadot blockchain, boasts a market capitalization of just $5.2 billion, compared to $293 billion for ETH and $2.06 trillion for BTC, according to data provider CoinGecko.

Should spot ETFs of such altcoins begin trading on Wall Street, traditional financial institutions and retail investors would be able to gain direct exposure to the tokens, which have historically been volatile. Issuers of spot ETFs actually buy and store the cryptocurrencies represented by the financial products on behalf of clients.

Ric Edelman, founder of the Digital Assets Council of Financial Professionals, told Decrypt it was a foregone conclusion that crypto ETFs would explode as soon as President Donald Trump, who campaigned avidly as a pro-crypto candidate, was reelected last fall. 

“It is regarded as inevitable that we’ll see many other single-asset and multi-asset ETFs of digital coins and tokens,” Edelman said. “The Bitcoin and Ethereum ETFs will prove to have been merely the first.”

“And all that’s just the start,” he continued. “Tokenization is underway and once all assets are tokenized, there will be thousands of ETFs, or their tokenized equivalents, launched. It’ll be the biggest explosion of investment opportunities ever.”

The Bloomberg analysts’ confidence that the SEC may soon approve so many crypto ETFs beyond BTC and ETH stems in part from the agency’s openness to engage with requests to list them in recent months—requesting updated details and public comments on numerous applications.

The applications have been filed by several Wall Street firms, ranging from crypto-centric investment managers like Grayscale to TradFi stalwarts including Fidelity and Franklin Templeton. 

“Engagement from the SEC is a very positive sign in our opinion,” Bloomberg’s Seyffart said. 

Another factor that has likely increased the odds of imminent spot ETF approvals for the altcoins in question is the fact that, in recent months, the CFTC has approved futures markets for all of them. Futures ETFs track the prices of derivatives contracts for assets, but do not involve the actual buying or selling of the underlying asset.

While the Bloomberg analysts are confident that altcoin spot ETFs will garner approvals before the end of the year, the exact timing remains uncertain. Seyffart said they could come in the next month, or perhaps not until the late fall—but that at this point, the question is a “matter of when not if.” 

Brian Rudick, chief strategy officer at Upexi, a publicly traded Solana-focused treasury company, told Decrypt that while ETF approvals for certain altcoins with lower trading volume may not necessarily result in immediately higher demand for those tokens, Wall Street debuts could have a dramatic price impact on more popular tokens like Solana.

“While demand for ETFs on long-tail alts may not materialize, ETFs based on top assets like Solana will likely see strong inflows and may act as a large positive catalyst for the price of the underlying token,” Rudick said. “Indeed, the spot ETFs were the main reason the price of Bitcoin more than doubled from when BlackRock applied for a spot Bitcoin ETF in mid-2023 through the exceptional inflows over the first six months after launch.”

Edited by James Rubin





THIEAUDIO Monarch MKIV: Flagship-Level Detail Meets Bone-Rattling Bass-custom-Tuned at Your Fingertips | Web3Wire



NEW YORK CITY, NY / ACCESS Newswire / June 20, 2025 / THIEAUDIO, a global leader in high-fidelity audio innovation, proudly announces the release of the Monarch MKIV, the latest generation of its legendary Monarch series. Combining technical mastery with user-focused customization, the Monarch MKIV introduces a new era of flexibility, refinement, and acoustic power for audiophiles.

For the first time in Thieaudio’s Signature Series lineup, the Monarch MKIV comes with an adjustable tuning switch. Extensive R&D has led to a functional tuning switch that is stylish, easy to use without tools, and has a significant impact on the tuning.

An All-New Tuning Profile

The Monarch MKIV encompasses both tradition and modernity in its tuning profiles.

STANDARD mode returns to a studio-neutral sound, with 10dB of impactful sub-bass and a sharp 150Hz cutoff for fast, clean bass and a neutral midrange. Treble is crisper and more natural than the previous generation, focusing on vocal clarity while smoothing out excessive upper harmonics.

RUMBLE mode loosens the low-mid crossover for fuller bass and thicker mids. Sub-bass is boosted to 13dB, while the 200-500Hz range gains 2dB, enhancing body and texture in bass guitar, male vocals, and instruments, ideal for those craving a richer, more powerful sound.

IMPACT² Dual Subwoofer System

At the heart of the MKIV lies IMPACT², THIEAUDIO’s proprietary isobaric subwoofer module. Using dual 8mm dynamic drivers arranged in a pressure-controlled chamber, IMPACT² delivers fast, punchy, and textured bass that enhances musical depth without overwhelming the mids. This unique design offers a sub-bass response that is both impactful and clean, ideal for bass lovers who still crave tonal accuracy.

Refined Treble with Redesigned EST Drivers

To deliver airy highs with greater finesse, the Monarch MKIV incorporates redesigned Sonion electrostatic (EST) drivers, extending treble performance up to 40kHz. A completely re-engineered crossover allows for smoother transitions from balanced armatures to ESTs, removing unwanted harshness and improving treble timbre. The result is a spacious, natural soundstage with heightened vocal clarity and realistic instrument separation.

Premium Grade Aluminum Alloy Shell

The MKIV debuts a CNC-milled Grade 5 aluminum alloy shell, offering superior durability, corrosion resistance, and scratch protection. Despite its robust build, the housing is featherlight and ergonomic, ensuring fatigue-free wear for long listening sessions. It’s a blend of form and function designed for audiophiles on the move.

Next-Generation Modular Cable

The Monarch MKIV also includes THIEAUDIO’s latest modular cable system, crafted from high-purity silver-plated and oxygen-free copper wires, with interchangeable 3.5mm and 4.4mm plugs included.

Technical Details

Driver: 2DD (8mm) + 6BA + 2EST (Sonion E50) + 1 Sonion 28UAP
Crossover: 4-way crossover with 4 sound tubes (2 ultra-high frequency drivers + 4 high frequency drivers + 2 mid frequency drivers + 2 low frequency drivers)
Frequency Response: 10Hz – 44kHz
Impedance: 10/9 ohms (±1 ohm) @ 1kHz
Sensitivity: 100dB ±1dB @ 1kHz
Total Harmonic Distortion (THD): <1% @ 1kHz

About THIEAUDIO

THIEAUDIO was founded by passionate engineers and DIY audiophiles with a mission to bring flagship audio performance to all. Known for delivering world-class sound at honest prices, the Monarch MKIV embodies this vision, built for those who demand excellence without compromise.

The THIEAUDIO Monarch MKIV is now available for purchase on the THIEAUDIO and Linsoul websites.

THIEAUDIO Website

Linsoul Website

SOURCE: LINSOUL INC





How U.S. Policy Can Support Tokenization



Tokenization is becoming an important part of how financial markets evolve. By representing real-world assets as tokens on public blockchains, institutions can create more efficient, transparent, and accessible systems for transferring value.

Across the United States, financial firms, infrastructure providers, and policymakers are exploring how tokenized assets could fit into the broader market structure. The technical foundation is already being used to support stablecoins, tokenized Treasuries, funds, and other instruments. The next step is ensuring the regulatory environment is equipped to support this transition.

This post identifies three core regulatory challenges facing tokenization in the U.S. and outlines practical steps policymakers can take to address them.

 

Three Core Blockers Holding Back U.S. Tokenization

 

Challenge #1: How Are Tokenized Assets Classified?  

One of the most persistent sources of regulatory uncertainty in tokenization is the lack of consistent legal classification. U.S. law does not yet offer a consistent taxonomy for digital assets. As a result, these assets are frequently subject to case-by-case interpretation. A fiat-backed stablecoin, for example, could be considered a payment instrument, a stored-value product, a security, a fund, or a bank deposit depending on how it is structured and who is reviewing it. Many issuers have chosen to avoid paying interest or implementing yield features precisely to avoid securities classification.

Tokenized Treasury products face similar challenges. While U.S. Treasuries themselves are exempt from SEC registration, packaging them into a pooled tokenized product could trigger the Investment Company Act. In other cases, the presence of yield or fractionalization could lead regulators to treat the token as a security in its own right.

This lack of definitional clarity forces companies to rely on legal opinions and conservative product design choices to avoid regulatory risk. It also undermines the ability of policymakers to craft targeted rules, since the foundational question of classification remains unsettled. Until U.S. regulators agree on consistent categories for tokenized assets, and define them in law, the market will continue to operate within a gray zone.

Challenge #2: What Standards Guide Interoperability?

Tokenization is built on the idea that digital assets can move across systems—between chains, platforms, and financial institutions—with the same ease and reliability as data on the internet. Technically, that vision is already being realized. Cross-chain interoperability protocols like Chainlink CCIP make it possible to transfer tokenized assets across different blockchains and systems.

While the infrastructure is advancing, the policy foundation requires more development. There is no clear regulatory framework in the U.S. that explains how compliance obligations apply when a tokenized asset moves across systems. Questions around custody, transfer restrictions, investor protections, and compliance responsibilities are often unresolved once an asset leaves its original environment.

For example, when a tokenized fund is transferred from one chain to another, it is not always clear whether the receiving environment must meet the same licensing or custodial standards. Institutions may hesitate to interact with assets across chains if they cannot verify how regulatory responsibilities carry over. This uncertainty reduces confidence, fragments liquidity, and limits the broader functionality of tokenized markets.

Challenge #3: What Is Preventing Broader Consumer Access?  

Tokenization is often described as a way to broaden participation in financial markets by lowering access barriers and embedding trust into financial products. Yet today, most U.S. consumers have limited access to tokenized assets through the platforms they already use.

One major reason is that regulated tokenized products are often restricted to private offerings or gated to accredited investors. Complex and fragmented licensing requirements, such as state-by-state money transmitter rules, broker-dealer registration, or the need for specialized trust charters, make it difficult for most consumer-facing platforms to launch and scale tokenized products.

This creates a two-tier system. Institutional investors and high-net-worth individuals are gaining early access to tokenized markets, while retail users are left on the sidelines. Without clear regulatory pathways for broad consumer distribution, many platforms focus only on permissioned or offshore use cases. 

There is also a gap in public understanding. Many consumers do not know what tokenized assets are, how they differ from traditional products, or how features like proof of reserves, automated compliance, or 24/7 liquidity can benefit them. Without clear regulatory pathways and accessible examples in the market, broader familiarity and trust have been slower to develop.

 

How U.S. Policy Can Clear the Path for Tokenization

 

Solution 1: Define what tokenized assets are and what they are not

Much of the legal uncertainty around tokenization comes down to the absence of clear, consistent definitions. Without a shared taxonomy for digital financial instruments, developers, institutions, and regulators are left interpreting how 20th-century laws apply to 21st-century products. This ambiguity leads to cautious product design, risk-averse legal positioning, and inconsistent treatment across agencies.

Headway is being made in this area with the GENIUS Act of 2025, now moving through the Senate, which proposes a statutory framework for fiat-backed stablecoins. It explicitly states that properly structured stablecoins are not securities, helping issuers and users operate with more confidence. Similar definitional clarity is needed across other categories, including tokenized Treasuries, funds, and real-world assets.

Emerging drafts of the next major market structure bill are expected to take a more comprehensive approach. Rather than forcing tokenized products into categories like “security” or “commodity,” these proposals aim to define digital assets based on their function, structure, and risk profile. Clear definitions for tokenized assets would give the entire industry a firmer legal foundation to build on and allow regulators to apply rules more consistently.

Solution 2: Develop Interoperability Policy Standards

Today, U.S. regulation does not explain how obligations like custody, transfer restrictions, or investor protections carry over in a cross-chain or cross-platform context. This creates friction for institutions that need certainty before they can operate across networks. Many choose to keep assets siloed within closed environments where legal responsibilities are easier to manage.

The GENIUS Act takes an important step by directing regulators to establish interoperability standards for payment stablecoins. But these standards are limited in scope. Additional guidance is needed for other tokenized assets, including Treasuries, funds, and real-world assets.

Policymakers can close this gap by developing regulatory frameworks that recognize how compliance obligations travel with assets across systems. This could involve coordinated rulemaking, joint agency guidance, or structured pilot programs that allow firms to test interoperable use cases under clear supervisory expectations.

A clear set of interoperability standards would allow firms to build for real-world use cases with confidence, ensuring that tokenized assets are not only technically portable but legally usable across the systems where they are needed most.

Solution 3: Create the Conditions for Widespread Consumer Access

Expanding consumer access to tokenized assets will require clearer rules for how these products can be offered to the public in a safe and compliant way. While interest is growing, many providers remain limited by regulatory structures that were not built with tokenized finance in mind.

Policymakers have an opportunity to reduce these barriers by developing frameworks that support broader retail participation without compromising trust or oversight. This could include refining licensing pathways for platforms that offer tokenized products, clarifying which types of assets are appropriate for general use, and establishing consistent standards for disclosures, custody, and investor protection.

These changes would give providers greater confidence to offer tokenized assets to the public and would help consumers better understand the products available to them. Education, transparency, and responsible distribution all play a role in ensuring that tokenization can serve everyday users, not just institutions.

Conclusion

Tokenization offers a once-in-a-generation opportunity to modernize financial markets. The technology is already in place. The demand from institutions is real. What’s missing is a regulatory environment that makes it possible to build and scale with confidence.

Rather than reinventing the system, the U.S. can move forward by doing three things well: assigning clear regulatory responsibility, defining digital assets with legal precision, and creating a workable path for tokenized products to reach the market. Legislative proposals like the GENIUS Act, updated market structure bills, and the Tokenization Report Act point in the right direction. Now it’s a matter of execution.

With the right legal framework, the U.S. can lead globally in building trusted, secure, and scalable markets for tokenized assets.




SEC Thailand Opens Public Consultation on Crypto Listing Criteria – Decrypt




In brief

Thailand’s SEC is seeking public input on new listing rules for digital assets, with a consultation open until July 21, 2025.
Proposed rules would allow exchanges to list self-issued tokens and require disclosures to prevent insider trading.
The move aligns with Thailand’s broader push to become a global crypto hub, following recent tax exemptions and regulatory reforms.

Thailand’s Securities and Exchange Commission has opened public consultation on revising criteria for digital asset listings on exchanges, seeking to align regulations with industry developments while “maintaining investor protections.”

The SEC announced on Friday that it is seeking feedback on principles to improve the selection process for digital assets on “Digital Asset Exchanges,” with the consultation period running until July 21, 2025.

“The SEC Board, at its June 2025 meeting, resolved to revise the criteria for selecting digital assets to be provided on the exchange to be in line with the context of the digital asset industry,” the SEC said in a statement.

The proposed changes would allow exchanges to list “ready-to-use digital tokens or cryptocurrencies” issued by the exchange itself or related parties for blockchain transactions. 

The move aims to provide digital assets that are “consistent with the development of innovation and usage,” while promoting Thailand’s digital asset ecosystem, according to the regulator’s announcement.

Monitoring for warning signs

Under the proposed framework, exchanges must disclose the names of persons related to digital token issuers and display warning symbols in reporting systems to help the SEC monitor and prevent insider trading. 

The regulator called for maintaining “regulatory mechanisms for preventing and managing conflicts of interest, preventing market manipulation of digital assets, and preventing unfair practices.”



For tokens already listed before the announcement takes effect, issuers have 90 days to provide related-party disclosures to exchanges.

The consultation marks another step in Thailand’s strategy to capture international crypto businesses and position itself as a regional financial center.

The country recently eliminated capital gains taxes on crypto sales for five years in a Cabinet decision, with the government projecting the initiative will generate economic benefits “by no less than 1,000 million baht” ($30.7 million) over the medium term.

Deputy Finance Minister Julapun Amornvivat called the tax exemption part of the government’s ambition to establish Thailand as “one of the world’s financial hubs.”

Thailand is also preparing pilot programs for crypto tourism payments in Phuket and considering allowing spot Bitcoin ETFs for retail investors.

In January, SEC Secretary-General Pornanong Budsaratragoon said Thailand must “move along with more adoption of cryptocurrencies worldwide.”

Edited by Sebastian Sinclair





Edge AI Market Analysis: Key Players, Forecast & Innovation Impact Till 2031 | Japan Technology | Web3Wire



Edge AI Market

Global Edge AI Market reached US$ 16.8 Billion in 2023 and is expected to reach US$ 73.8 Billion by 2031, growing with a CAGR of 20.6% during the forecast period 2024-2031.
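As a quick arithmetic check, the implied growth rate can be recomputed from the two endpoint figures alone. This is a rough sketch; the report’s 20.6% presumably rests on an unquoted 2024 base-year value.

```python
# Sanity-check the quoted CAGR from the endpoint figures alone.
# Using the 2023 value over the 8 years to 2031 gives ~20.3%, close to the
# reported 20.6% (which likely uses an unquoted 2024 base-year value).
start_usd_bn = 16.8   # market size in 2023, US$ billions
end_usd_bn = 73.8     # forecast size in 2031, US$ billions
years = 2031 - 2023

cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> Implied CAGR: 20.3%
```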

Latest News in Japan:

Tokyo-based EdgeCortix lands US$21 million NEDO grant

EdgeCortix secured a ¥3 billion (≈US$21 million) award from Japan’s NEDO to develop its NovaEdge energy-efficient chiplet for edge AI inference and learning.

Japanese edge AI startup scores U.S. Defense Innovation Unit deal

In a milestone move for the industry, EdgeCortix became the first Japanese chip company to win a U.S. defense contract, showcasing its edge AI capabilities for military applications.

Mitsubishi Electric launches manufacturing-specific edge-device language model

With its Maisart® brand, Mitsubishi Electric unveiled a compact, data-augmented LLM capable of running on edge devices, ideal for on-premises factory automation solutions.

ARBOR showcases water‐and‐dust‐resistant Edge AI hardware at Japan IT Week

ARBOR Technology demonstrated rugged, IP65-rated Edge AI systems such as the Ruby 10 and ARES-1983H, designed for harsh industrial environments, at Tokyo’s Japan IT Week Spring 2025.

Unlock exclusive insights with our detailed sample report (Corporate Email ID gets priority access): https://datamintelligence.com/download-sample/edge-ai-market?kb

Edge AI Market Overview

Edge AI (Edge Artificial Intelligence) refers to the deployment of AI algorithms directly on edge devices, such as sensors, cameras, smartphones, industrial machines, or gateways, without needing a constant connection to centralized cloud servers. It enables real-time data processing, decision-making, and analysis at or near the data source.

This localized processing reduces latency, enhances data privacy, and ensures faster response times, making Edge AI ideal for use cases like autonomous vehicles, smart manufacturing, remote healthcare, surveillance systems, and IoT applications.

Research Methodology:

The global Edge AI Market research report is based on a combination of primary and secondary data sources. The study carefully analyzes various factors influencing the industry, including government regulations, market dynamics, competitive landscape, historical trends, current conditions, and technological progress. It also evaluates future developments, related industry trends, market volatility, growth prospects, potential obstacles, and emerging challenges.

Segment Covered in the Edge AI Market:

By Component: Hardware, Software, Edge Cloud Infrastructure, Services

By Technology: Machine Learning (Deep Learning, Machine Learning Models), Computer Vision, Natural Language Processing, Predictive Analytics

By End-User: Consumer Electronics, Manufacturing, Automotive, Government, Healthcare, Energy, Others

Speak to Our Analyst and Get Customization in the report as per your requirements: https://datamintelligence.com/customize/edge-ai-market?kb

List of the Key Players in the Edge AI Market:

ADLINK Technology Inc., Alphabet Inc., Amazon.com, Inc., Gorilla Technology Group, Intel Corporation, International Business Machines Corporation, Microsoft Corporation, Nutanix, Inc., Synaptics Incorporated, and Viso.ai.

Industry Development:

On February 1, 2024, Advantech introduced the MIC-715-OX, a ruggedized edge AI system designed specifically for heavy industrial environments. Built to withstand harsh conditions, the system features a durable, shock- and vibration-resistant enclosure, an IP67 rating for protection against water and dust, and a fanless thermal design that ensures reliable performance without drawing in contaminated air.

On January 10, 2024, Ambarella, Inc. unveiled the Cooper Developer Platform, a cutting-edge solution that integrates hardware, software, finely tuned AI models, and essential services into a unified system. Designed to support Ambarella’s full suite of AI SoCs, Cooper streamlines development and deployment across edge AI applications.

Regional Analysis for Edge AI Market:

⇥ North America (U.S., Canada, Mexico)

⇥ Europe (U.K., Italy, Germany, Russia, France, Spain, The Netherlands and Rest of Europe)

⇥ Asia-Pacific (India, Japan, China, South Korea, Australia, Indonesia, and Rest of Asia-Pacific)

⇥ South America (Colombia, Brazil, Argentina, Rest of South America)

⇥ Middle East & Africa (Saudi Arabia, U.A.E., South Africa, Rest of Middle East & Africa)

The Report Includes:

➡ A descriptive analysis of demand-supply gap, market size estimation, SWOT analysis, PESTEL Analysis and forecast in the global market.

➡ Top-down and bottom-up approach for regional analysis

➡ Go-to-market Strategy.

➡ Neutral perspective on the market performance.

➡ Customized regional/country reports as per request and country level analysis.

➡ Potential & niche segments and regions exhibiting promising growth covered.

People Also Ask:

➠ What are the global sales, production, consumption, imports, and exports in the Edge AI market?

➠ Who are the top manufacturers, and what are their capacity, production, sales, pricing, and revenue stats?

➠ What key opportunities and challenges do vendors face in the Edge AI industry?

➠ Which product types, applications, or end-users are driving market growth, and what is their market share?

➠ What are the major growth drivers and restraints of the Edge AI market?

Stay informed with the latest industry insights. Start your subscription now: https://www.datamintelligence.com/reports-subscription?kb

Contact Us –

Company Name: DataM Intelligence
Contact Person: Sai Kiran
Email: Sai.k@datamintelligence.com
Phone: +1 877 441 4866
Website: https://www.datamintelligence.com

About Us –

DataM Intelligence is a Market Research and Consulting firm that provides end-to-end business solutions to organizations, from Research to Consulting. We, at DataM Intelligence, leverage our trademark insights into top trends and developments to deliver swift and astute solutions to clients like you. We offer a multitude of syndicated and customized reports built on a robust methodology.

Our research database features countless statistics and in-depth analyses across 6,300+ reports in 40+ domains, creating business solutions for more than 200 companies across 50+ countries and catering to the key business research needs that influence the growth trajectory of our vast clientele.

This release was published on openPR.





ESA Predicts Humans Living in ‘Space Oases’ on Mars in 2040 – Decrypt




In brief

Technology 2040 Vision envisages humans living in “autonomous habitats beyond Earth.”
The destinations could include the moon, Mars, and beyond.
In a tweet, ESA Director-General Josef Aschbacher billed the document as a “call to action.”

The European Space Agency (ESA) has outlined expectations for humans living on other planets as soon as 2040 in a newly published document.

Technology 2040 Vision envisages humans living in “autonomous habitats beyond Earth” on the Moon, Mars and beyond, following what it predicts will be a “rapid evolution of technology” in the coming years.

In a tweet, ESA Director-General Josef Aschbacher billed the document as a “call to action,” as part of a roadmap to building a “resilient European presence across Earth orbit and beyond.”

The document was released to clarify what space exploration—and habitation—could look like in the near future. The next steps in human exploration beyond Earth’s orbit “will involve longer stays and farther destinations,” its authors wrote.

This new wave of exploration will be underpinned by “space oases,” self-sustaining habitats that will protect astronauts using “circular management of resources.” The result should enable astronauts to spend far longer in space compared to current missions that are limited to around six months at most, the authors added.

Future space habitats will make use of “smart materials” and “in-situ manufacturing,” while supplies will be delivered using “high-velocity logistics” and technology such as mass drivers.

Doing all this while keeping the environmental impact to a minimum will pose challenges, the report’s authors noted. “Achieving true sustainability requires the kind of circular thinking increasingly seen on Earth,” with a “holistic” approach to using resources.

“The ability to repurpose and recycle materials in orbit is not only key to sustainability but will also enable new markets and capabilities and add additional commercial value to space assets,” the report said, building on predictions from Dietmar Pilz, ESA Director of Space Technology, Engineering and Quality at ESTEC, that the global space economy could be worth as much as €1 trillion by 2040.

Communications will also be improved, with “optical communications links” and “relay spacecraft” enabling “trunk lines” carrying communications and network data streams as far as Saturn.

The ESA predicted that AI and quantum technologies would play large roles in the leap forward, with smart materials, long-term sustainability, and modular payloads all expected to advance in the coming years.

Nevertheless, with this huge endeavour still to achieve—and a recent SpaceX rocket test failure demonstrating that space travel remains fraught with risks—2040 may be closer than it sounds.

Edited by Stephen Graves







Building Connected AI Agents: The New Technology Stack



The internet revolutionized how we communicate and work together. Before we had standard protocols like HTTP for websites and SMTP for email, companies struggled with custom integrations and broken systems. Each organization built their own solutions, and nothing worked together smoothly.

Today, AI agents face the exact same problem. These powerful digital assistants can analyze data, write code, and automate business processes. But they work alone, trapped in their own digital silos. One agent might discover important insights about customer behavior while another agent handles support tickets for the same customers, yet they cannot share information or coordinate their efforts.

This isolation limits what AI agents can accomplish. However, change is coming. A new technology stack is emerging that will connect AI agents and help them work together like a coordinated team.

The Current Problem: Isolated AI Agents

Companies are rapidly adopting AI agents for various tasks. These agents excel at specific jobs – they write marketing copy, analyze financial data, manage customer relationships, and monitor system performance. But they operate like isolated islands, each unaware of what others are doing.

This creates several serious problems. When agents cannot communicate, they often duplicate work, miss important connections between different business areas, and fail to coordinate their actions. For example, a sales agent might pursue a lead while a support agent simultaneously deals with that same customer’s complaint, but neither agent knows about the other’s activities.

The technical infrastructure makes this worse. Most AI agents today use custom-built connections to access tools and data. Developers create unique integrations for each agent, making the systems fragile and difficult to maintain. When something breaks, it often takes the entire system down.

Current agent frameworks also lack consistency. Some treat agents like chatbots that respond to individual requests. Others view them as workflow engines that follow predetermined steps. Still others design them as planning systems that figure out their own approach to problems. This inconsistency makes it nearly impossible to create agents that work together effectively.

Most importantly, existing systems provide no backbone for collaboration. Agents cannot easily share what they learn, coordinate their activities, or build on each other’s work. Everything happens through direct connections or gets buried in log files that other agents cannot access.

The Solution: Four Key Technologies Working Together

The solution requires four essential technologies working as a unified stack. Think of this as the foundation that will enable AI agents to collaborate effectively:

Agent-to-Agent Protocol (A2A) – This gives agents a standard way to discover and communicate with each other, similar to how HTTP allows websites to communicate.

Model Context Protocol (MCP) – This standardizes how agents use tools and access external systems, ensuring they can reliably interact with databases, APIs, and other resources.

Apache Kafka – This provides a robust messaging system that allows agents to share information reliably and at scale, even when some agents are temporarily unavailable.

Apache Flink – This processes streams of information in real-time, enabling agents to react quickly to events and coordinate complex workflows.

Together, these technologies create what experts call the KAMF Stack (Kafka, A2A, MCP, and Flink), a foundation for building connected AI agent systems.

How Agents Discover and Communicate: The A2A Protocol

Google developed the Agent-to-Agent (A2A) protocol to solve the communication problem between AI agents. Just as HTTP created a standard way for web browsers to request information from servers, A2A establishes a standard way for agents to find and collaborate with each other.

The protocol works through several key mechanisms. First, agents announce their capabilities using an AgentCard, which functions like a business card that describes what the agent can do and how other agents can request its help. This eliminates the guesswork about which agent handles which tasks.

Second, agents send structured requests to each other using a format called JSON-RPC. When one agent needs help, it can send a clear request to another agent and receive a structured response. This enables reliable and predictable interactions between different AI systems.

Third, agents can stream updates using Server-Sent Events (SSE). This means that when one agent starts a long-running task, it can provide real-time updates to other agents about its progress. This prevents agents from waiting indefinitely or assuming a task has failed.

Fourth, agents exchange rich content beyond simple text messages. They can share files, structured data, forms, and other complex information types, enabling sophisticated collaboration on complex business processes.

Finally, the protocol includes built-in security features. All communications use HTTPS encryption, and the system supports authentication and permission controls to ensure only authorized agents can access sensitive capabilities.
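To make these mechanics concrete, here is a minimal sketch of the discovery-and-request flow in Python. The AgentCard fields and the JSON-RPC method name loosely follow the published A2A spec but are simplified here; the URL and skill are illustrative placeholders, not part of any real deployment.

```python
import json
import requests  # assumes the target agent exposes A2A over plain HTTPS

# Illustrative AgentCard, the kind of document an agent serves at a
# well-known URL so peers can discover it. Fields are simplified.
agent_card = {
    "name": "scheduling-agent",
    "description": "Answers questions about staff availability",
    "url": "https://agents.example.com/scheduling",
    "capabilities": {"streaming": True},
    "skills": [{"id": "staff-availability", "name": "Staff availability lookup"}],
}

# A JSON-RPC 2.0 request asking the agent to handle a message.
rpc_request = {
    "jsonrpc": "2.0",
    "id": "req-001",
    "method": "message/send",  # simplified A2A-style method name
    "params": {
        "message": {"role": "user",
                    "parts": [{"text": "Who is on shift tomorrow?"}]},
    },
}

response = requests.post(agent_card["url"], json=rpc_request, timeout=30)
print(json.dumps(response.json(), indent=2))  # structured, predictable reply
```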

Standardizing Tool Access: The Model Context Protocol

While A2A handles communication between agents, Anthropic’s Model Context Protocol (MCP) standardizes how agents interact with tools and external systems. This protocol ensures that agents can reliably access databases, call APIs, run scripts, and integrate with business applications.

Before MCP, developers had to create custom integrations for each tool an agent needed to use. This created brittle connections that often broke when systems were updated or configurations changed. MCP solves this by providing a standard interface that works across different tools and platforms.

The protocol defines clear methods for agents to discover available tools, understand their capabilities, and invoke them safely. When an agent needs to query a database, call a web service, or execute a function, it uses standardized MCP commands that work consistently across different environments.

MCP also handles context management, helping agents maintain awareness of their working environment and available resources. This prevents the confusion and errors that occur when agents lose track of their capabilities or try to use tools that are not available.
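As a sketch of what this looks like in practice, the snippet below exposes a single tool through FastMCP from the official Python SDK (the `mcp` package). The tool name and the canned data are hypothetical stand-ins for a real ticketing-system query.

```python
# Minimal MCP tool server sketch using FastMCP from the Python SDK
# (pip install mcp). The tool and its data are hypothetical examples.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("support-metrics")

@mcp.tool()
def average_wait_time(queue: str) -> float:
    """Return the average support-ticket wait time in minutes for a queue."""
    # A real deployment would query the ticketing database here.
    sample_waits = {"billing": 42.0, "technical": 18.5}
    return sample_waits.get(queue, 0.0)

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for any MCP client to discover and call
```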

Together, A2A and MCP provide the foundation for agent collaboration. MCP gives individual agents reliable access to tools and data, while A2A enables multiple agents to work together on complex tasks.

Why Protocols Alone Are Not Enough

Having standard protocols like A2A and MCP represents important progress, but protocols alone cannot solve the scalability and reliability challenges of enterprise AI systems. Consider an analogy: imagine running a large company where employees can only communicate through direct, one-on-one conversations.

In such a company, sharing information becomes exponentially more difficult as the organization grows. Each person must know who to contact for different types of information, track down individual colleagues when they need help, and manually relay messages between different teams. This approach might work for small groups, but it becomes chaotic and inefficient at scale.

The same problem affects AI agent systems that rely solely on direct connections. As companies deploy more agents, the number of required point-to-point connections grows quadratically, since each new agent may need a link to every existing one. Each agent must be aware of every other agent it might need to collaborate with, creating a complex web of dependencies that becomes increasingly difficult to manage.

Direct connections also create reliability problems. When one agent becomes unavailable, all the agents that depend on it may fail or become stuck waiting for responses. The system lacks resilience because there is no buffer or alternative path for information flow.

Additionally, direct connections make it difficult to observe and debug agent behavior. When agents communicate only through private channels, administrators cannot easily track what information flows through the system, diagnose problems, or replay events to understand what went wrong.

Event-Driven Architecture: The Missing Foundation

The solution to these scalability and reliability challenges lies in event-driven architecture. Instead of requiring agents to communicate directly with each other, an event-driven system allows agents to publish information about their activities and subscribe to information from other agents.

This approach transforms agent communication from a network of point-to-point connections into a broadcast system. When an agent completes a task, discovers an insight, or needs help, it publishes an event to a central messaging system. Other agents can subscribe to the types of events they are interested in and respond accordingly.

Event-driven architecture provides several critical benefits for AI agent systems. It decouples agents from each other, meaning they do not need to know specific details about other agents to collaborate effectively. It provides durability, ensuring that important information is not lost when individual agents become unavailable. It enables replay and debugging, allowing administrators to trace the flow of events through the system and understand how decisions were made.

Most importantly, event-driven architecture scales naturally. Adding new agents to the system does not require reconfiguring existing agents or creating new direct connections. New agents subscribe to relevant event streams and begin participating in the collaborative workflow.

Apache Kafka: The Messaging Backbone

Apache Kafka serves as the messaging backbone for event-driven AI agent systems. Originally developed at LinkedIn to handle massive streams of user activity data, Kafka has become the standard platform for building scalable, real-time data pipelines.

Kafka organizes information into topics, which function like channels or feeds that agents can publish to and subscribe to. When an agent completes a task, it publishes an event to the appropriate topic. Other agents subscribe to topics that contain information relevant to their responsibilities.

The platform provides several features that make it ideal for AI agent systems. First, Kafka ensures durability by storing all events on disk and replicating them across multiple servers. This means that even if some servers fail, the event history remains available and agents can continue working.

Second, Kafka supports high throughput and low latency, handling millions of events per second while maintaining fast response times. This enables real-time coordination between agents even in large, busy systems.

Third, Kafka maintains a complete, time-ordered log of all events. This creates an audit trail that administrators can use to understand system behavior, debug problems, and replay events when necessary. For AI systems, this observability is crucial for maintaining trust and reliability.

Fourth, Kafka decouples event producers from consumers. Agents that publish events do not need to know which other agents will consume those events. This flexibility enables easy addition of new agents, modification of existing workflows, and adaptation of the system as business requirements evolve.
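A minimal publish/subscribe sketch with the kafka-python client shows the decoupling in code: the producing agent references only a topic, never the consumers. The broker address, topic name, and event fields are placeholders.

```python
# Publish/subscribe sketch using kafka-python (pip install kafka-python).
# Broker, topic, and event fields are illustrative placeholders.
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# An agent announces a finding without knowing who will consume it.
producer.send("customer-satisfaction",
              {"event": "HighWaitTimes", "queue": "billing"})
producer.flush()

# Elsewhere, another agent subscribes to the same topic.
consumer = KafkaConsumer(
    "customer-satisfaction",
    bootstrap_servers="localhost:9092",
    group_id="workforce-agent",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for record in consumer:
    print(record.value)  # e.g. {'event': 'HighWaitTimes', 'queue': 'billing'}
```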

Apache Flink: Real-Time Stream Processing

While Kafka handles the movement and storage of event streams, Apache Flink processes those streams in real-time to enable intelligent coordination and decision-making. Flink transforms raw event streams into actionable insights and coordinated responses.

Flink excels at several types of stream processing that are essential for AI agent systems. It can filter events to identify patterns or anomalies that require attention. It can enrich events by combining information from multiple sources to provide complete context. It can aggregate events over time windows to identify trends or calculate metrics. It can join different event streams to correlate activities across different parts of the system.

Most importantly for AI agents, Flink can maintain state across long-running processes. Many business workflows require multiple steps that happen over extended periods. Flink can track the progress of these workflows, ensure that all necessary steps are completed successfully, and handle failures gracefully.

Flink also provides exactly-once processing guarantees, meaning that each event is processed exactly once, even if parts of the system fail and restart. This reliability is crucial for business-critical processes where duplicate or missed actions could cause serious problems.
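The sketch below shows one way such a job can look with PyFlink’s Table API: per-queue average wait times over one-minute tumbling windows. The `datagen` connector stands in for the Kafka-backed source a production job would use, and the schema is invented for illustration.

```python
# Windowed aggregation sketch using the PyFlink Table API
# (pip install apache-flink). The datagen source is a stand-in for Kafka.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical event stream; a real job would declare a Kafka connector here.
t_env.execute_sql("""
    CREATE TEMPORARY TABLE wait_times (
        queue STRING,
        wait_minutes DOUBLE,
        ts TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
    ) WITH ('connector' = 'datagen')
""")

# Average wait per queue over one-minute tumbling windows; an alerting job
# could filter this stream for averages above a threshold.
t_env.execute_sql("""
    SELECT queue,
           TUMBLE_START(ts, INTERVAL '1' MINUTE) AS window_start,
           AVG(wait_minutes) AS avg_wait
    FROM wait_times
    GROUP BY queue, TUMBLE(ts, INTERVAL '1' MINUTE)
""").print()
```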

The combination of Kafka and Flink creates a powerful foundation for agent coordination. Kafka ensures that all agent activities are captured and shared reliably, while Flink processes those activities to trigger appropriate responses and maintain system-wide coordination.

The Complete Stack in Action

The four technologies work together to create a comprehensive platform for connected AI agents. Here is how they collaborate in a typical enterprise scenario:

An AI agent responsible for monitoring customer satisfaction analyzes support ticket data and discovers that customers are experiencing unusually high wait times. Using MCP, the agent reliably accesses the support ticket database and calculates relevant metrics. It then publishes a “HighWaitTimes” event to a Kafka topic.

A Flink stream processing job continuously monitors customer satisfaction events. When it detects the high wait times event, it correlates this information with other recent events, such as staff scheduling changes and system performance metrics. Based on this analysis, Flink triggers a “StaffingAlert” event.

An agent responsible for workforce management subscribes to staffing alerts. When it receives the alert, it uses A2A protocol to communicate with the scheduling agent, requesting information about available staff members. The scheduling agent responds with current availability data.

The workforce management agent then uses MCP to access the staff scheduling system and automatically assigns additional support representatives to reduce wait times. It publishes a “StaffingAdjustment” event to keep other agents informed of the change.

A reporting agent subscribed to staffing events captures this information and updates executive dashboards in real-time, ensuring that management stays informed about both the problem and the automated response.

Throughout this entire process, all events are logged in Kafka, creating a complete audit trail. Administrators can trace exactly how the system detected the problem, what decisions were made, and what actions were taken. This transparency builds trust in the automated system and helps identify areas for improvement.
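The glue code for one step of that workflow might look like the following sketch: the workforce agent consumes staffing alerts from Kafka, queries the scheduling agent with an A2A-style JSON-RPC call, and publishes the resulting adjustment. Topic names, the URL, and the RPC method are illustrative placeholders, not names from any real system.

```python
# Coordination-loop sketch: react to Kafka alerts, consult a peer agent over
# JSON-RPC, publish the outcome. All names and URLs are placeholders.
import json
import requests
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "staffing-alerts", bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")))
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"))

for alert in consumer:
    # Ask the scheduling agent (an A2A peer) for current availability.
    rpc = {"jsonrpc": "2.0", "id": "req-1", "method": "message/send",
           "params": {"message": {"role": "user",
                                  "parts": [{"text": "List available staff"}]}}}
    reply = requests.post("https://agents.example.com/scheduling",
                          json=rpc, timeout=30).json()

    # Publish the adjustment so reporting and audit consumers stay informed.
    producer.send("staffing-events",
                  {"event": "StaffingAdjustment",
                   "alert": alert.value,
                   "assigned": reply.get("result")})
```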

Benefits of the Connected Agent Stack

The KAMF stack provides several significant advantages over isolated agent systems. First, it enables true collaboration between agents, allowing them to share insights, coordinate activities, and build on each other’s work. This collaborative intelligence often produces better results than individual agents working alone.

Second, the stack provides built-in observability and debugging capabilities. All agent activities are captured in event streams, making it easy to understand system behavior, identify problems, and optimize performance. This transparency is crucial for maintaining reliable AI systems in production environments.

Third, the architecture scales naturally as organizations add more agents. New agents can join existing event streams without requiring changes to existing agents or complex integration projects. This scalability enables organizations to gradually expand their AI capabilities without major system disruptions.

Fourth, the stack provides resilience and fault tolerance. When individual agents fail or become unavailable, the event-driven architecture ensures that important information is not lost, allowing other agents to continue working. The system can recover gracefully from failures and maintain business continuity.

Finally, the stack enables continuous learning and improvement. By analyzing event streams over time, organizations can identify patterns, optimize workflows, and discover new opportunities for automation. The complete event history provides rich data for training and improving AI models.

Implementation Considerations

Organizations considering the KAMF stack should plan carefully for successful implementation. First, they need to establish clear event schemas and naming conventions to ensure consistent communication between agents. Without standardized event formats, agents may misinterpret information or overlook relevant events.
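One lightweight way to pin down such a convention is a shared event envelope that every agent serializes before publishing. The sketch below is one possible shape, a team convention rather than any standard, with illustrative field names.

```python
# A shared event envelope as a team convention (not a standard).
import json
import uuid
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone

@dataclass
class AgentEvent:
    event_type: str          # e.g. "HighWaitTimes"
    source_agent: str        # which agent produced the event
    payload: dict            # event-specific data
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    occurred_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

event = AgentEvent("HighWaitTimes", "satisfaction-monitor",
                   {"queue": "billing", "avg_wait_minutes": 42.0})
print(json.dumps(asdict(event)))  # serialized form, ready for a Kafka topic
```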

Second, they should design appropriate topic structures in Kafka to logically organize different types of events. Well-designed topic hierarchies make it easier for agents to subscribe to relevant information and avoid being overwhelmed by irrelevant events.

Third, they need to implement proper security and access controls. Event streams often contain sensitive business information, so organizations must ensure that only authorized personnel can access the relevant data streams.

Fourth, they should establish monitoring and alerting for the underlying infrastructure. While the KAMF stack provides resilience, the Kafka and Flink systems themselves require monitoring to ensure optimal performance and reliability.

Finally, organizations should start with pilot projects that demonstrate value before scaling to enterprise-wide deployments. Beginning with limited use cases allows teams to gain experience with the technology and refine their approaches before tackling more complex scenarios.

The Future of Connected AI Agents

The emergence of the KAMF stack represents a fundamental shift in how we think about AI systems. Instead of building isolated, special-purpose agents, organizations can now create collaborative agent ecosystems that work together intelligently and efficiently.

This shift mirrors the evolution of the early internet. Just as HTTP and SMTP enabled unprecedented global connectivity and collaboration, A2A and MCP protocols combined with Kafka and Flink infrastructure will enable new forms of automated intelligence and coordination.

We are moving toward a future where AI agents communicate as naturally as humans do, sharing information seamlessly and coordinating complex activities across organizational boundaries. This connected intelligence will unlock new possibilities for automation, optimization, and innovation that isolated agents simply cannot achieve.

Organizations that adopt this connected approach early will gain significant competitive advantages. They will be able to deploy AI capabilities more quickly, adapt to changing business requirements more flexibly, and achieve higher levels of automation and efficiency.

However, realizing this vision requires commitment to open standards and collaborative development. Just as the internet succeeded because it was built on open protocols and shared infrastructure, the connected agent ecosystem will succeed only if organizations work together to adopt common standards and contribute to shared platforms.

The KAMF stack provides the foundation for this collaborative future. By combining proven protocols with robust infrastructure, it offers a practical path toward building AI agent systems that are not just intelligent but truly collaborative and production-ready.

The future belongs to organizations that can harness not just individual AI capabilities, but collective AI intelligence. The tools to build that future are available today.




Leading Electrical Distributor Selects HawkSearch to Power Unified Enterprise Search Across Over 70 Sites | Web3Wire



Unified AI-powered search connects products, content, and events to grow revenue

WOBURN, MA / ACCESS Newswire / June 19, 2025 / Bridgeline Digital Inc. (NASDAQ:BLIN), a leader in AI-driven marketing technology, announced today that a national electrical distributor has chosen HawkSearch to power a unified AI search experience across more than 70 digital storefronts.

The electrical distributor integrates HawkSearch with Sitecore CMS and Salesforce Commerce Cloud to create a unified search that supports product discovery, content navigation, and event promotion. Unified search powered by AI ensures users find the most relevant results across all systems, from products to events, in a single experience.

With HawkSearch, a search for “EV charging accessories” can return relevant results from multiple sources, such as product listings, related blog content, installation guides, and upcoming webinars, regardless of where the content is stored. This improves discovery, shortens the buyer journey, and creates a cohesive digital experience across all storefronts, which ultimately drives more revenue.

“This deployment highlights HawkSearch’s ability to unify search across multiple complex digital environments,” said Ari Kahn, CEO of Bridgeline. “We’re excited to help this distributor create a more powerful and personalized search experience.”

About Bridgeline Digital

Bridgeline helps companies grow online revenue by increasing traffic, conversion rates, and average order value. To learn more, visit http://www.bridgeline.com.

Contact:

Danielle Colvin
SVP of Marketing
Bridgeline Digital
[email protected]

SOURCE: Bridgeline Digital





Treasury Secretary Bessent Says Stablecoins Can Bolster US Dollar ‘Supremacy’ – Decrypt




In brief

Treasury Secretary Scott Bessent said stablecoins could strengthen dollar dominance and urged swift passage of federal crypto legislation.
His comments came as Trump pushed Congress to fast-track the GENIUS Act after Senate approval, reversing last month’s failed procedural vote.
Industry leaders welcomed the move but warned that political infighting and perceived conflicts could undermine trust in the bill.

Treasury Secretary Scott Bessent said Wednesday, “Stablecoins can reinforce dollar supremacy,” pushing back against critics who view crypto as a threat to America’s currency dominance, as President Trump urged Congress to fast-track landmark legislation.

“Crypto is not a threat to the dollar,” Bessent tweeted on Wednesday, calling digital assets “one of the most important phenomena in the world right now” that have been “ignored by national governments for far too long.”

The comments came as Trump pressed House lawmakers to quickly pass the GENIUS Act after the Senate approved the stablecoin framework on Tuesday. 

Tuesday’s Senate passage marked a reversal from last month, when the GENIUS Act failed a procedural vote after pro-crypto Democrats withdrew support over concerns about national security provisions and conflicts of interest from the Trump family. 

Bessent condemned that earlier defeat, warning that “the world is watching while American lawmakers twiddle their thumbs” and urging Congress to “either step up and lead or watch digital asset innovation move offshore.”

The legislation would establish federal rules for issuing and trading stablecoins—digital tokens typically pegged to the U.S. dollar.



“Stablecoins could end up being one of the largest buyers of U.S. treasuries or T-bills,” the Treasury Secretary said in a New York Post interview, explaining how someone using a dollar-backed stablecoin in Nigeria could transact without actually holding physical dollars.

“I think there’s a very good chance that crypto is actually one of the things that locks in dollar supremacy,” Bessent said, noting the Biden administration tried to “make it extinct” rather than embrace innovation. 

Industry leaders welcomed the Senate passage, while acknowledging ongoing political tensions. 

Ira Auerbach, Head of Tandem at Offchain Labs, told Decrypt that “the continued political divide on crypto is creating a market operating under a ‘best guesses’ framework, and that’s becoming untenable for an industry growing at breakneck speed.”

Auerbach pointed out that stablecoins “require a different legislative approach than digital assets like memecoins or trading tokens,” saying that these are “separate issues” where speculative concerns “shouldn’t be allowed to impede” payment infrastructure development. 

However, concerns persist about conflicts of interest. 

While speaking to Decrypt, Alexander Urbelis, General Counsel at ENS Labs, warned that the GENIUS Act’s “perceived entanglement to the Trump family’s private interests” could “erode trust and credibility in the legislative process” and fuel “political theatrics” over crypto’s supposed risks.

Urbelis cautioned that in an era of “deep fakes amplified by social media platforms that have abandoned fact checking,” conspiracy theories about dollar mismanagement could “undermine public trust” and have “global consequences.”

Edited by Sebastian Sinclair





Nura Labs announces the release of Nura Wallet on Google Play Store | Web3Wire



Sheridan, WY, June 18, 2025 (GLOBE NEWSWIRE) —

Patent-protected Nura Wallet NOW LIVE on Google Play Store with $NURA token exploding onto DEX exchanges June 18th – The AI wallet that actually makes you money is here

Your crypto just became intelligent. The revolutionary Nura Wallet is NOW AVAILABLE on Google Play Store, featuring the world’s first AI that automatically maximizes your DeFi profits 24/7. Tomorrow, June 18th, the $NURA token launches on DEX exchanges, completing the first crypto ecosystem where your wallet literally works to make you richer.

Download the future NOW: https://play.google.com/store/apps/details?id=com.nuralab

This isn’t just another wallet launch. This is the commercial debut of patent-protected AI technology (US 63/819,482) that’s about to bring a billion new users into DeFi.

The Autonomous Machine Is Live and Ready

The Nura Wallet does what every crypto holder has dreamed of: it automatically finds the best yields across hundreds of DeFi protocols and optimizes your returns around the clock. No more missing opportunities. No more complex strategies. Just pure, automated profit maximization – and it’s available RIGHT NOW on your Android device.

“We just made every other crypto wallet obsolete,” said the Nura Labs team. “Our AI doesn’t just store your crypto – it actively works to multiply it. Users are seeing 23% higher yields and 67% lower fees. This is the iPhone moment for DeFi, and everyone can download it today.”

$NURA Token: The Fuel Behind The Autonomous Machine

Tomorrow’s $NURA token launch isn’t just another token drop – it’s the key that unlocks premium AI features that could transform your portfolio:

– Premium AI Optimization – Advanced algorithms that crush basic strategies

– Fee Elimination – Massive transaction cost reductions

– Exclusive Yields – Access to premium staking opportunities

– Governance Power – Vote on which protocols get integrated next

– Compounding Rewards – Stake $NURA for additional yield amplification

This token has REAL utility from day one through the live wallet application you can download right now.

The Numbers That Will Blow Your Mind

Users who already downloaded the app are going absolutely crazy over the results:

– 23% higher staking yields than manual management

– 67% lower transaction costs through AI optimization

– 89% less time required – literally set and forget

– 94% user satisfaction – people are obsessed

– 5,000+ beta users already maximizing profits with the live app

These aren’t projections. This is real money being made by real users RIGHT NOW.

Why This Launch Is About To EXPLODE

– Available TODAY: Download immediately from Google Play Store

– Patent Protection: Legal moat around breakthrough technology

– First-Mover Advantage: No competition in AI wallet space

– Massive Market: $180B DeFi market with <10% penetration

– Real Utility: Token powers live, working application

– Mobile-First: Perfect for 100M+ smartphone crypto users

The Revolution Starts RIGHT NOW

TODAY: Download Nura Wallet and experience the future

TOMORROW: $NURA token trading begins on major DEX platforms

THIS WEEK: Watch your crypto start working harder than it ever has

The beta is over. The waiting is over. The DeFi complexity crisis is OVER.

Get The App That’s Changing Everything

Download Nura Wallet NOW:

https://play.google.com/store/apps/details?id=com.nuralab

$NURA Trading Begins: June 18, 2025 on major DEX platforms

Website: https://nura.gg/

Token Launch Details

$NURA Trading Begins: June 18, 2025

Initial Pairs: $NURA/ETH, $NURA/USDC

Exchanges: Major DEX platforms

Utility: Live from day one through Nura Wallet

About Nura Labs

Nura Labs just solved crypto’s biggest problem: making DeFi work for everyone. Our patent-protected AI technology transforms any smartphone into an institutional-grade profit machine. The complexity barrier is DEAD. The profit barrier is DEMOLISHED.

 DOWNLOAD NOW. YOUR CRYPTO IS WAITING TO GET SMARTER.

Disclaimer: The information provided in this press release is not a solicitation for investment, nor is it intended as investment advice, financial advice, or trading advice. It is strongly recommended you practice due diligence, including consultation with a professional financial advisor, before investing in or trading cryptocurrency and securities.




