🛒 Building a Simple E-Commerce Cart in React

Welcome to the ultimate guide for building a core feature of any modern web application: a fully functional shopping cart!

In the world of e-commerce, the shopping cart is more than just a list—it’s the critical hub where user interaction meets application state. As React developers, understanding how to manage this dynamic state is fundamental.

This tutorial dives into a clean, component-driven approach to building a shopping cart using React Hooks (useState). You’ll learn essential techniques like:

  • State Immutability: Updating arrays correctly (without mutation) in React.

  • Prop Drilling: Passing functions and data down through components.

  • Component Composition: Structuring your application into reusable pieces (ProductCard, ProductList, and Cart).
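Before we start, here is a quick toy sketch (not part of the cart code itself) of the first point: React compares state by reference, so array state must be replaced, never mutated.

```javascript
// Toy sketch: immutable vs. mutable array updates in React state.
// React compares state by reference, so a mutated array looks "unchanged".
const cartItems = [{ id: 1, name: 'Laptop', quantity: 1 }];

// Wrong: cartItems[0].quantity += 1;
// Same array reference -> React may skip the re-render entirely.

// Right: produce a NEW array with NEW objects for any changed items.
const updated = cartItems.map(item =>
  item.id === 1 ? { ...item, quantity: item.quantity + 1 } : item
);

console.log(cartItems[0].quantity); // 1 -- the original state is untouched
console.log(updated[0].quantity);   // 2 -- the new array carries the change
```

This map-and-spread pattern is exactly what our `addToCart` handler will use later in App.js.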

By the end of this guide, you won’t just have a working cart; you’ll have mastered the state management principles needed to build any complex feature in a modern React application.

Let’s dive in and start coding!

🏗️ Project Structure and Setup

 

We need five files in total for this improved structure:

src/
├── components/
│   ├── Cart.js
│   ├── ProductList.js
│   └── ProductCard.js  <-- NEW COMPONENT
├── data.js
└── App.js

1. The Data (data.js)

 

Our list of products remains the same:

// data.js
export const products = [
  { id: 1, name: 'Laptop', price: 1200 },
  { id: 2, name: 'Mouse', price: 25 },
  { id: 3, name: 'Keyboard', price: 75 },
  { id: 4, name: 'Monitor', price: 300 },
];

📦 Presentation Components

 

2. Product Card (components/ProductCard.js)

 

This component handles the display of a single product and provides the button to interact with the cart.

// components/ProductCard.js
import React from 'react';

const ProductCard = ({ product, onAddToCart }) => {
  return (
    <div 
      style={{ border: '1px solid #eee', padding: '15px', borderRadius: '5px' }}
    >
      <h3>{product.name}</h3>
      <p><strong>${product.price.toFixed(2)}</strong></p>
      {/* The button triggers the onAddToCart function passed down from App.js */}
      <button 
        onClick={() => onAddToCart(product)} 
        style={{ padding: '10px', backgroundColor: 'teal', color: 'white', border: 'none', cursor: 'pointer' }}
      >
        Add to Cart
      </button>
    </div>
  );
};

export default ProductCard;

3. Product List (components/ProductList.js)

 

This component is the container. It receives the list of products and the onAddToCart function as props, and its sole job is to map over the list and render a ProductCard for each item.

// components/ProductList.js
import React from 'react';
import ProductCard from './ProductCard'; // <-- Import the Card

const ProductList = ({ products, onAddToCart }) => {
  return (
    <div style={{ display: 'grid', gridTemplateColumns: 'repeat(2, 1fr)', gap: '20px' }}>
      {products.map(product => (
        <ProductCard 
          key={product.id} 
          product={product} 
          onAddToCart={onAddToCart} 
        />
      ))}
    </div>
  );
};

export default ProductList;

4. Cart Display (components/Cart.js)

This component remains the same, responsible for displaying the items and the total.

// components/Cart.js
import React from 'react';

const Cart = ({ items, total, onRemoveFromCart }) => {
  return (
    <div>
      {/* ... (Cart rendering logic remains the same) ... */}
      {items.length === 0 ? (
        <p>Your cart is empty.</p>
      ) : (
        <>
          <ul style={{ listStyle: 'none', padding: 0 }}>
            {items.map(item => (
              <li 
                key={item.id} 
                style={{ display: 'flex', justifyContent: 'space-between', marginBottom: '10px', borderBottom: '1px dotted #ccc', paddingBottom: '5px' }}
              >
                <span>{item.name} (x{item.quantity})</span>
                <span>
                  <strong>${(item.price * item.quantity).toFixed(2)}</strong>
                  <button 
                    onClick={() => onRemoveFromCart(item.id)}
                    style={{ marginLeft: '10px', backgroundColor: 'red', color: 'white', border: 'none', cursor: 'pointer', padding: '5px 8px' }}
                  >
                    -
                  </button>
                </span>
              </li>
            ))}
          </ul>
          <hr />
          <h3>Total: ${total.toFixed(2)}</h3>
        </>
      )}
    </div>
  );
};

export default Cart;

⚛️ The Main Application Logic (App.js)

 

This file is the central hub for state management and passing down the necessary props (data and functions) to its children.

// App.js
import React, { useState } from 'react';
import { products } from './data';
import ProductList from './components/ProductList'; // <-- Imported
import Cart from './components/Cart';

function App() {
  const [cartItems, setCartItems] = useState([]);

  // Function to add an item to the cart (Logic remains the same)
  const addToCart = (product) => {
    const existingItem = cartItems.find(item => item.id === product.id);

    if (existingItem) {
      setCartItems(
        cartItems.map(item =>
          item.id === product.id
            ? { ...item, quantity: item.quantity + 1 }
            : item
        )
      );
    } else {
      setCartItems([...cartItems, { ...product, quantity: 1 }]);
    }
  };

  // Function to remove an item or decrease quantity (Logic remains the same)
  const removeFromCart = (productId) => {
    setCartItems(
      cartItems.reduce((acc, item) => {
        if (item.id === productId) {
          if (item.quantity > 1) {
            acc.push({ ...item, quantity: item.quantity - 1 });
          }
        } else {
          acc.push(item);
        }
        return acc;
      }, [])
    );
  };

  // Calculate the total cost
  const cartTotal = cartItems.reduce(
    (total, item) => total + item.price * item.quantity,
    0
  );

  return (
    <div className="App" style={{ padding: '20px' }}>
      <h1>🛒 Simple E-Commerce App</h1>
      <div style={{ display: 'flex', gap: '40px' }}>
        
        {/* Product List Section - Uses the ProductList container */}
        <section style={{ flex: 2 }}>
          <h2>Products</h2>
          <ProductList 
            products={products} 
            onAddToCart={addToCart} // <-- Passing the function down
          />
        </section>

        {/* Cart Section */}
        <section style={{ flex: 1, borderLeft: '1px solid #ccc', paddingLeft: '20px' }}>
          <h2>Your Cart</h2>
          <Cart
            items={cartItems}
            total={cartTotal}
            onRemoveFromCart={removeFromCart}
          />
        </section>
      </div>
    </div>
  );
}

export default App;

Benefits of Component Separation

 

By splitting the product display into two components, we achieve better React practices:

  • ProductList: Responsible for listing (the structure and mapping).

  • ProductCard: Responsible for presenting a single item’s details and handling its specific action (the button).

This makes it easy to change the appearance of a product card without touching the listing logic, and vice versa.

Useful links below:

Let me & my team build you a money making website/blog for your business https://bit.ly/tnrwebsite_service

Get Bluehost hosting for as little as $1.99/month (save 75%)…https://bit.ly/3C1fZd2

Best email marketing automation solution on the market! http://www.aweber.com/?373860

Build high converting sales funnels with a few simple clicks of your mouse! https://bit.ly/484YV29

Join my Patreon for one-on-one coaching and help with your coding…https://www.patreon.com/c/TyronneRatcliff

Buy me a coffee ☕️https://buymeacoffee.com/tyronneratcliff



Why Just an LLM Isn’t Enough

Large Language Models (LLMs) like GPT-4, Claude, or LLaMA are incredible at generating text.

However, a raw LLM is essentially a stateless, isolated text-generation engine.

It can’t remember past conversations, access up-to-date information, or perform complex, multi-step tasks.

This is where LangChain comes in.

LangChain is an open-source framework designed to help developers build data-aware and agentic applications by connecting LLMs to external data sources, computation, and memory.

It turns a simple text generator into a sophisticated, multi-tool workflow.

Understanding the Core Components of LangChain

 

LangChain’s power comes from its modular architecture. Every complex LLM application you build is a “chain” or a “graph” of these simple, interchangeable parts.

  • LLMs/Chat Models: The engine. Interfaces for any language model (OpenAI, Anthropic, local models, etc.). Analogy: The Brain.

  • Prompt Templates: Standardized blueprints for sending input to the LLM. Analogy: The Script (telling the brain what to say).

  • Chains/LCEL: Sequences of components that execute in order. Analogy: The Workflow (step 1 -> step 2 -> step 3).

  • Retrieval (RAG): Connects the LLM to external, proprietary, or up-to-date data. Analogy: The Knowledge Base.

  • Agents & Tools: Let the LLM choose an action (tool) to take based on the input. Analogy: The Decision Maker & Hands.

  • Memory: Stores conversation history for multi-turn interactions. Analogy: The Short-term Memory.

Tutorial: Building Your First Simple Chain (LCEL)

 

The modern way to build in LangChain is using the LangChain Expression Language (LCEL), which allows for declarative, chainable, and highly efficient pipelines.

Step 1: Setup and Installation

# Install the core library and the OpenAI integration
!pip install langchain langchain-openai

import os
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Set your API Key (Best practice is to use environment variables)
# os.environ["OPENAI_API_KEY"] = "YOUR_API_KEY"

Step 2: Define the Components

 

We’ll define three core components: the model, the prompt, and the output parser.

A. The Chat Model

 

We initialize the model interface. We use ChatOpenAI because it’s built for conversational inputs.

# 1. Initialize the LLM (The Brain)
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)

B. The Prompt Template

 

This template defines the System and Human roles for the conversation.

# 2. Define the Prompt Template (The Script)
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a professional chef. Answer all questions with a culinary twist, focusing on simple, delicious recipes."),
    ("user", "{user_input}") # This is our dynamic input variable
])

C. The Output Parser

 

The LLM returns a complex object, but we often just want a plain string. The parser handles this conversion.

# 3. Define the Output Parser (The Formatter)
parser = StrOutputParser()

Step 3: Chain the Components with LCEL

 

We use the pipe operator (|) to connect these components into a single, cohesive application called a chain.

# 4. Chain the components using LCEL (The Workflow)
chain = prompt | llm | parser

Step 4: Invoke the Chain

 

Run the chain by passing the user’s input as the user_input variable defined in the prompt.

# 5. Invoke the chain
user_query = "I need a quick dinner idea for a weeknight with chicken."

response = chain.invoke({"user_input": user_query})

print(f"**Query:** {user_query}\n")
print(f"**Response:** {response}")

Expected Output (Example)

 

Response: Ah, a weeknight dinner! We need something fast, flavorful, and reliable. Let’s whip up a 15-Minute Lemon Herb Chicken Sauté. Think of it as a flawless mise en place for your evening. Simply slice your chicken breast thin, sauté with olive oil, garlic, and a generous pinch of dried oregano. Deglaze with a splash of white wine (or chicken broth), finish with a squeeze of fresh lemon juice, and toss with some pre-cooked rice or quickly steamed green beans. Bon Appétit!


Beyond the Basics: Retrieval-Augmented Generation (RAG)

 

The single most impactful use case for LangChain is Retrieval-Augmented Generation (RAG).

RAG allows your LLM to answer questions about specific, private, or current data (like company documents, recent news, or a personal knowledge base) by retrieving relevant documents before generating the final answer.

How it works:

  1. Load: Load your documents (PDFs, websites, etc.) using a Document Loader.

  2. Split: Use a Text Splitter to break large documents into smaller chunks.

  3. Embed & Store: Convert these chunks into vector embeddings and store them in a Vector Store (e.g., Chroma, FAISS).

  4. Retrieve: When a user asks a question, a Retriever finds the top-K relevant document chunks.

  5. Generate: The LLM is given the user’s question and the retrieved chunks (the context) to generate a grounded, accurate answer.
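To make the five steps concrete without needing an API key or a vector database, here is a framework-free toy pipeline in plain Python. All the function names here are my own inventions (a real app would use LangChain's loaders, splitters, and vector stores), and a word-count "embedding" stands in for real vectors:

```python
# Toy RAG pipeline -- illustrative only; every name here is made up for this sketch.
from collections import Counter
import math

# 1. Load: pretend this text came from a Document Loader.
document = ("The refund policy allows returns within 30 days. "
            "Shipping is free on orders over 50 dollars. "
            "Support is available by email around the clock.")

# 2. Split: naive "chunking" by sentence.
chunks = [s.strip() + "." for s in document.split(".") if s.strip()]

# 3. Embed & Store: bag-of-words counts stand in for vector embeddings.
def embed(text):
    return Counter(text.lower().replace(".", "").split())

store = [(chunk, embed(chunk)) for chunk in chunks]

# 4. Retrieve: rank chunks by cosine similarity to the question.
def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(question, k=1):
    q = embed(question)
    ranked = sorted(store, key=lambda cv: cosine(q, cv[1]), reverse=True)
    return [chunk for chunk, _ in ranked[:k]]

# 5. Generate: in a real chain, the LLM receives the question plus this context.
context = retrieve("How many days do I have to return an item?")[0]
prompt = f"Answer using only this context:\n{context}"
print(context)  # -> The refund policy allows returns within 30 days.
```

The real versions of steps 3-5 are what `VectorStore`, `Retriever`, and `create_retrieval_chain` handle for you, as mapped in the table below.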

RAG Component Map

 

  • External Data: DocumentLoader

  • Vector Database: VectorStore

  • Search Mechanism: Retriever

  • Final Workflow: create_retrieval_chain

Conclusion: LangChain is Your LLM Toolkit

 

LangChain is more than a library—it’s an opinionated approach to building sophisticated, production-ready LLM applications. By mastering its core components (Prompts, Models, and Chains/LCEL), you can connect the power of AI to the real world, transforming raw LLMs into intelligent, context-aware systems.
