Taking stock of recent popular Crypto+AI projects: three trends have shifted significantly


I reviewed several popular projects in the Crypto+AI track over the past month and noticed three notable trend changes. Brief project introductions and comments follow:

1) Project technical paths are becoming more pragmatic, with a focus on performance data rather than pure conceptual packaging;

2) Vertical niche scenarios are becoming the expansion focus, with generalized AI giving way to specialized AI;

3) Capital is paying more attention to business-model verification, and projects with cash flow are clearly favored.

Appendix: project overviews, highlights analysis, and personal comments:

1. @yupp_ai

Project Overview: A decentralized AI model evaluation platform that completed a $33 million seed round in June, led by a16z, with Jeff Dean participating.

Highlights Analysis: It applies the strengths of human subjective judgment to the weaknesses of AI evaluation. Over 500 large models are scored via human crowdsourcing, and user feedback is convertible to cash (1,000 points = $1). The platform has already attracted data purchases from companies such as OpenAI, so there is real cash flow.
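The points-to-cash conversion described above can be sketched in a few lines. This is a minimal illustration assuming only the 1,000-points-per-dollar rate stated in the article; the function and constant names are hypothetical, not Yupp's actual API.

```python
# Illustrative sketch of the stated payout rate (1,000 points = $1).
# Names are hypothetical; only the conversion rate comes from the article.

POINTS_PER_DOLLAR = 1_000

def points_to_usd(points: int) -> float:
    """Convert crowdsourced-evaluation points to a USD payout."""
    return points / POINTS_PER_DOLLAR

print(points_to_usd(25_000))  # 25,000 points converts to $25
```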

Personal Comments: A project with a relatively clear business model, not a pure cash-burning play. Its anti-Sybil algorithms will need continuous optimization. Judging from the $33 million financing scale, capital is clearly favoring projects with verified monetization.

2. @Gradient_HQ

Project Overview: A decentralized AI computing network that completed a $10 million seed round in June, led by Pantera Capital and Multicoin Capital.

Highlights Analysis: Its Sentry Nodes browser extension already enjoys some market consensus in the Solana DePIN field. The team includes Helium alumni. The newly launched Lattica data-transmission protocol and Parallax inference engine represent substantive exploration in edge computing and data verifiability, reportedly reducing latency by 40% and supporting heterogeneous device access.

Personal Comments: The direction is right, capturing the trend of AI "sinking" toward local and edge deployment. But for complex tasks its efficiency still has to be benchmarked against centralized platforms, and edge-node stability remains an issue. That said, edge computing is both a new demand emerging from web2 AI's intense internal competition and a natural advantage of web3's distributed architecture, so I am optimistic about adoption driven by concrete, performance-oriented products.

3. @PublicAI_

Project Overview: A decentralized AI data infrastructure platform that incentivizes global users to contribute data across domains (medical, autonomous driving, voice, etc.) through token rewards. Cumulative revenue exceeds $14 million, and it has built a million-scale data-contributor network.

Highlights Analysis: Technically, it integrates ZK verification with a BFT consensus algorithm to ensure data quality, and uses Amazon Nitro Enclaves privacy-computing technology to meet compliance requirements. Interestingly, it has launched the HeadCap brain-wave collection device, expanding from software into hardware. The economic model is also well designed: users can earn $16 plus 500,000 points for 10 hours of voice annotation, and enterprise data-service subscription costs are reportedly reduced by 45%.
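A quick back-of-envelope check of the annotation incentive cited above ($16 plus 500,000 points for 10 hours of voice work). The variable names are illustrative; the cash value of PublicAI points is not stated in the article, so only the cash component is reduced to an hourly rate.

```python
# Back-of-envelope breakdown of the stated annotation reward.
# Only the $16 / 500,000 points / 10 hours figures come from the article;
# the points' USD value is unknown and is left unconverted.

CASH_USD = 16.0
POINTS = 500_000
HOURS = 10

cash_per_hour = CASH_USD / HOURS    # cash component, USD per hour
points_per_hour = POINTS / HOURS    # points component, points per hour

print(cash_per_hour, points_per_hour)
```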

Personal Comments: The project's greatest value seems to lie in addressing real needs in AI data annotation, especially in fields like medicine and autonomous driving with extremely high data-quality and compliance requirements. However, its 20% error rate is still above traditional platforms' 10%, and data-quality fluctuation is an ongoing problem to solve. The brain-computer-interface direction offers interesting room for imagination, but execution will not be easy.

4. @sparkchainai

Project Overview: A Solana-based distributed computing network that completed $10.8 million in financing in June, led by OakStone Ventures.

Highlights Analysis: It uses dynamic sharding to aggregate idle GPU resources, supporting large-model inference such as Llama3-405B at costs 40% lower than AWS. The token-based data-transaction design is interesting: it turns computing-resource contributors directly into stakeholders, incentivizing broader network participation.

Personal Comments: A typical "aggregate idle resources" model, and logically sound. But a 15% cross-chain verification error rate is genuinely high, and technical stability needs further refinement. It does have advantages in 3D-rendering scenarios, where real-time requirements are lower. The key is whether the error rate can be brought down; otherwise even the best business model will be dragged down by technical issues.

5. @olaxbt_terminal

Project Overview: An AI-driven cryptocurrency high-frequency trading platform that completed a $3.38 million seed round in June, led by @ambergroup_io.

Highlights Analysis: Its MCP technology can dynamically optimize trading paths to reduce slippage, with a measured efficiency improvement of 30%. Riding the #AgentFi trend, it has found an entry point in the relatively untapped niche of DeFi quantitative trading, filling a real market need.

Personal Comments: The direction is correct; DeFi does need smarter trading tools. But high-frequency trading demands extremely low latency and high accuracy, and the real-time coordination between AI prediction and on-chain execution remains to be verified. MEV attacks are also a major risk, so technical protection measures must keep pace.

Disclaimer: The content above is only the author's opinion which does not represent any position of Followin, and is not intended as, and shall not be understood or construed as, investment advice from Followin.