iPhone Killer AI Device? | Grok This! The Sarcastic LLM | ChatGPT Widens The Gap

Cut through the noise - here are the top stories in AI

AI Intelligence Briefing: No Content Without Context

More signal, less noise.

Humane AI: The Star Trek Com-Badge Is Here! (Is This An iPhone-killer?)

So…just a thought…Tim Cook might want to take a closer look at Humane AI, and then he might want to talk to his CFO and figure out what it's going to take to bring the company into the Apple fold ASAP. Humane's product, which is hitting the market at less than half the price of a fully loaded iPhone 15 Pro, could reduce or even eliminate the need to carry a smartphone.

What is the Humane AI Pin? It's the Star Trek com badge: a wearable device that connects to an AI platform (through T-Mobile). It has a built-in camera and projector, and it represents a whole new way to interface with the online and physical worlds. Humane was co-founded by Imran Chaudhri and Bethany Bongiorno, two ex-Apple employees with a vision of a hands-free, screen-free device experience powered by AI, using voice, vision, and gestures as the primary interfaces to the platform. With AI partnerships with OpenAI and Microsoft, connectivity through T-Mobile, built-in music through Tidal, and chips provided by Qualcomm, Humane is taking orders starting November 16, 2023.

Grok This! Say Hello To My Sarcastic Little Friend!

This past week, Elon Musk showed up to the AI party with Grok, the first real LLM offering from xAI. It's still in beta, but at first glance its main differentiators seem to be its sarcastic replies and the fact that it is trained on X (the social media platform, not the drug) in real time.

As one of the original founders of OpenAI, Elon Musk has had his hands in AI for a while. After the OpenAI board declined Musk's offer to run the organization (which was a non-profit at the time), Musk resigned from the board and focused his AI efforts on Tesla and now xAI. Grok, the first LLM from xAI, is in limited beta release, and the tech press is buzzing, circulating the same cherry-picked screenshots of Grok in action. Here's what we know so far:

  • Grok is sarcastic, and in some situations will give you flippant answers

  • Grok has a real-time knowledge base drawn from the X platform (formerly Twitter).

  • Against a random (or possibly cherry-picked) benchmark (National Hungarian High School Math Test), Grok scored decently against other LLMs

  • Access to Grok will be through a premium X subscription.

  • Despite Musk's pseudo-libertarian rhetoric about ChatGPT and other LLMs, Grok still has its guardrails.

  • Grok is not multimodal; it is still text-only.

  • Grok has a context window of 28,000 tokens, well short of other LLMs such as ChatGPT (now running GPT-4 Turbo) and Claude 2, each of which supports 100,000 or more. The context window determines how much information you can put in a prompt to help the LLM respond appropriately (see the token-counting sketch after this list).

  • A version of Grok will run natively in Teslas using local compute power.
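
To make that context-window bullet concrete, here is a minimal token-counting sketch. It uses OpenAI's open-source tiktoken tokenizer purely for illustration (an assumption; xAI has not published Grok's tokenizer), but the idea is the same for any LLM: the prompt and the reply both have to fit inside the model's context window.

```python
# Minimal sketch: measuring how many tokens a prompt consumes.
# tiktoken is OpenAI's tokenizer library, used here only as an example;
# Grok's actual tokenizer is not public.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")

prompt = "Summarize this transcript:\n" + "some long transcript text " * 2000

token_count = len(encoding.encode(prompt))
print(f"Prompt uses {token_count:,} tokens")

# A model with a 28,000-token window has to fit this prompt *plus* its
# reply within that budget; anything beyond it is truncated or rejected.
```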

In short, at this stage, Grok's single differentiator is that it is specifically designed to be an AI representation of Elon Musk's personality, sense of humor, and cultural and political biases. Whether that is a bug or a feature depends on your point of view.

How robust Grok's guardrails actually are remains to be seen. While Grok's response to a query about how to make cocaine at home was flippant, it ultimately said that, of course, it can't give out that kind of information. But Grok's attempt at a humorous response to a serious threat almost begs real bad actors (or even the non-malicious but curious) to test whether a little advanced prompting can't wiggle this information out of our fun-loving new AI friend.

Grok is also obviously slated to be the AI nervous system of the Elonverse, tying together X (social media), Tesla cars and home products, the Boring Company (tunnels), and SpaceX. Strategically tying its rollout to premium X subscribers is a Hail Mary, but in the short term it might at least get a much-needed first down for X's QB, Linda Yaccarino.

The long-term problem is that Musk's views are not shared by all of his customers, as evidenced by the tremendous loss of users and advertisers on X. And while many Tesla owners might overlook his views because they love the cars, they may not feel the same way when their personal vehicle starts taking on his opinions.

This first iteration of Grok does not look like serious competition for ChatGPT or Claude 2. It might be more competitive against Bard, but all of its competitors have a free version, while early access to Grok is apparently tied to having an X premium plan. Many would-be early testers or adopters will not want to pay to beta-test the app. And while $16 a month is not outrageous, sarcasm is available all over the web for free.

ChatGPT: Widening the Gap

OpenAI had its dev conference last week and let the AI world know that ChatGPT is still the industry leader, rolling out new features at AI-lightning speed. While Google races to get Gemini ready for release and hedges its bets with huge investments in Anthropic and Character.AI, and xAI debuts its Musk-inspired Grok, OpenAI keeps loading up features in GPT-4 and developing an ecosystem along the way.

Here's an in-depth look at what GPT-4 Turbo offers:

Expanded Context Length for In-Depth Interactions

GPT-4 Turbo now supports an unprecedented context length of up to 128,000 tokens. This allows for more extensive and detailed input and output, so that users can engage in more in-depth and complex interactions than ever before.
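
As a rough sketch of what that looks like from the API side, using the OpenAI Python SDK and the gpt-4-1106-preview model name from the announcement (the document and prompt below are placeholders):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# With a 128K-token window, a book-length document can go into a single
# request instead of being chunked and summarized piecemeal.
long_document = open("annual_report.txt").read()  # placeholder file

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # GPT-4 Turbo preview at launch
    messages=[
        {"role": "system", "content": "You are a careful analyst."},
        {"role": "user",
         "content": f"Summarize the key risks in this report:\n\n{long_document}"},
    ],
)
print(response.choices[0].message.content)
```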

JSON Mode for Streamlined API Calls

JSON mode makes it easier for GPT-4 Turbo to respond in JSON, a structured data format, which is really handy when it works with APIs (a way for programs to talk to each other). This makes GPT-4 Turbo even better for creating software and handling data.
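
A minimal sketch of JSON mode with the OpenAI Python SDK (the keys asked for in the prompt are made-up examples):

```python
from openai import OpenAI

client = OpenAI()

# response_format={"type": "json_object"} tells the model to emit valid JSON.
# Note: the prompt itself must mention JSON, or the API rejects the request.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    response_format={"type": "json_object"},
    messages=[
        {"role": "system",
         "content": "Reply in JSON with the keys 'sentiment' and 'summary'."},
        {"role": "user", "content": "The new AI pin looks promising but pricey."},
    ],
)
print(response.choices[0].message.content)  # e.g. {"sentiment": "mixed", ...}
```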

Advanced Function Calling

The model has been upgraded to support multiple function calls simultaneously. This improvement significantly increases the efficiency of GPT-4 Turbo, allowing for a more dynamic and versatile interaction with the model.
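
A sketch of parallel function calling with the OpenAI Python SDK; the get_weather tool and its parameters are hypothetical placeholders:

```python
from openai import OpenAI

client = OpenAI()

# One hypothetical tool; GPT-4 Turbo can return several tool_calls in a
# single response (e.g. the weather for two cities at once).
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    tools=tools,
    messages=[{"role": "user", "content": "Compare the weather in Paris and Tokyo."}],
)

for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```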

Knowledge Update

GPT-4 Turbo's knowledge has been updated to include information up until April 2023. This extension of the knowledge cutoff ensures that GPT-4 Turbo remains a cutting-edge tool for accessing and generating current information.

External Knowledge Integration

The new version allows users to integrate external knowledge sources into their GPT applications, enhancing the model's versatility and applicability in various domains.
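
One common pattern is to retrieve relevant snippets yourself and inject them into the prompt; the sketch below assumes a placeholder retrieve_passages function standing in for whatever search index or vector database an application actually uses:

```python
from openai import OpenAI

client = OpenAI()

def retrieve_passages(query: str) -> list[str]:
    """Placeholder for a real search step (vector DB, keyword index, etc.)."""
    return [
        "Humane's AI Pin ships with connectivity through T-Mobile.",
        "The Pin projects a simple display onto the user's palm.",
    ]

question = "How does the AI Pin connect to the internet?"
context = "\n".join(retrieve_passages(question))

# The retrieved snippets ground the answer in knowledge the model
# was never trained on.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```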

Now Multi-modal

GPT-4 Turbo now accepts images as inputs and can generate captions and classifications for them, and the platform can now produce natural-sounding speech from text. This multimodal capability significantly broadens the range of applications for the model.
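
A sketch of the image-input side, using the vision-enabled model name from the same announcement (gpt-4-vision-preview); the image URL is a placeholder:

```python
from openai import OpenAI

client = OpenAI()

# A single user message can mix text and an image; the model answers
# questions about the picture or captions it.
response = client.chat.completions.create(
    model="gpt-4-vision-preview",
    max_tokens=200,
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image in one sentence."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/photo.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)
```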

Customization and Business Applications

Tailored Solutions for Specific Business Needs

OpenAI introduces customization features, allowing for the fine-tuning of models to meet specific business requirements. This customization necessitates collaboration with the OpenAI team and is primarily targeted at enterprise-level applications.
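
For context, the general shape of OpenAI's self-serve fine-tuning API is sketched below; at the time it applied to gpt-3.5-turbo, while GPT-4 custom models required working directly with the OpenAI team. The training file is a placeholder:

```python
from openai import OpenAI

client = OpenAI()

# Upload a JSONL file of example conversations, then start a fine-tuning job.
training_file = client.files.create(
    file=open("company_examples.jsonl", "rb"),  # placeholder training data
    purpose="fine-tune",
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",  # self-serve fine-tuning target at the time
)
print(job.id, job.status)
```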

Enhanced Control Over Model Behavior

With these new features, users gain increased control over the model's behavior, enabling more precise and tailored outputs suitable for diverse business applications.
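
One small example of that kind of control is the seed parameter announced at the same event, which makes sampling largely reproducible; the system prompt below is illustrative only:

```python
from openai import OpenAI

client = OpenAI()

# A system message pins down tone and format, while seed plus a low
# temperature keeps repeated runs close to identical.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    seed=42,
    temperature=0.2,
    messages=[
        {"role": "system",
         "content": "You are a terse support bot. Answer in two sentences."},
        {"role": "user", "content": "My device won't pair over Bluetooth."},
    ],
)
print(response.choices[0].message.content)
```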

Addressing Copyright Concerns

Legal Support for Model Output-Related Copyright Issues

In response to growing concerns about copyright, OpenAI provides legal support to users who face copyright issues related to model outputs. This support is particularly relevant for users building applications with the API and for enterprise customers.

Pricing Model Adjustments

Reduced Costs for Enhanced Accessibility

OpenAI has significantly reduced the costs for using GPT-4 Turbo. The new pricing structure makes prompt tokens three times cheaper and completion tokens twice as cheap compared to the previous version of GPT-4. This cost reduction is anticipated to drive cost savings for developers and potentially lead to an increase in the development of new AI applications.
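
As a back-of-the-envelope check using the per-1K-token prices published at launch ($0.03/$0.06 for GPT-4 prompt/completion versus $0.01/$0.03 for GPT-4 Turbo; the token counts below are made up for illustration):

```python
# Rough cost comparison for one request with 10,000 prompt tokens and
# 1,000 completion tokens, at the per-1K prices published at launch.
PROMPT_TOKENS, COMPLETION_TOKENS = 10_000, 1_000

gpt4_cost = PROMPT_TOKENS / 1000 * 0.03 + COMPLETION_TOKENS / 1000 * 0.06   # $0.36
turbo_cost = PROMPT_TOKENS / 1000 * 0.01 + COMPLETION_TOKENS / 1000 * 0.03  # $0.13

print(f"GPT-4: ${gpt4_cost:.2f}   GPT-4 Turbo: ${turbo_cost:.2f}")
```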

Integration with ChatGPT

Enhanced ChatGPT with GPT-4 Turbo

ChatGPT now utilizes GPT-4 Turbo, incorporating the latest improvements such as a larger context window and higher rate limits. This integration offers users a more powerful and efficient tool for various applications.

Seamless Integration with Browsing, Plugins, and DALL·E 3

The latest version of ChatGPT seamlessly integrates browsing capabilities, plugins, and DALL·E 3, offering an all-encompassing tool that eliminates the need to switch between different applications.

Introduction of GPTs for Customized ChatGPT Applications

Building Customized Versions of ChatGPT

GPTs allow users to build customized versions of ChatGPT tailored for specific purposes. This feature enables users to provide unique instructions, expanded knowledge, and actions specific to their requirements.

Roll Your Own GPT

Creating a GPT is simple: it uses natural-language instructions, and no code is necessary. This accessibility broadens the appeal, making GPTs suitable for a wide range of users.
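
For developers who would rather build the same kind of customized assistant in code, the Assistants API announced at the same event is the programmatic counterpart; a minimal sketch (the name and instructions are made up):

```python
from openai import OpenAI

client = OpenAI()

# Create an assistant with custom instructions and a built-in tool,
# roughly what the no-code GPT builder does behind the scenes.
assistant = client.beta.assistants.create(
    name="Newsletter Helper",  # hypothetical example
    instructions="You summarize AI news in a skeptical, punchy tone.",
    tools=[{"type": "code_interpreter"}],
    model="gpt-4-1106-preview",
)

thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Summarize this week's GPT-4 Turbo announcements.",
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
print(run.status)  # poll until "completed", then read the thread's messages
```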

Increased Accessibility and Efficiency

Higher Rate Limits for Established Customers

For existing GPT-4 customers, the token rate limits have been doubled, allowing for more intensive use of the model.

Direct Rate Limit and Quota Adjustments

Users can now request changes to their rate limits and quotas directly within their API account settings, offering greater flexibility and convenience.

The release of GPT-4 Turbo shows that OpenAI is not waiting for the competition to catch up. By going multimodal and integrating DALL·E 3, OpenAI is going head to head with Midjourney and the other text-to-image AI apps. And by rolling out GPTs, OpenAI is encouraging nontechnical creators to build AI apps within its ecosystem. OpenAI increased the size of the context window to keep pace with Anthropic's Claude 2 and stay ahead of Bard and now Grok.