Surviving Big Tech – Post 4

Me, write a book? Why not…

First Book: Building Intelligent .NET Applications

After the MSDN article, I continued writing articles in my spare time for various technical journals. I also switched jobs a couple of times, but left on good terms. I never burned bridges, even when I wanted to.

By 2004, I was working with a team of developers doing .NET development using C#. Besides working, writing articles and taking care of my children, I was fascinated with the world of Artificial Intelligence (AI).

On weekends, I would get my husband to watch the kids so I could go to Barnes and Noble and read every book I could about AI. There were not many. I think I read them all. I also spent free time at work checking out technologies emerging from Microsoft Research.

One day, an acquisitions editor from Pearson sent me an email. She had read some of my technical articles and wanted to see if I would be interested in writing a book for them. Seriously, I was asked this. Whhhhaaaat???

I thought about it for a week and ended up responding “Sure, why not. What is involved?” She explained that I would need to come up with a book proposal and if approved, it would move forward through their process.

I spent the next 3 months putting together a substantial book proposal for a book centered on using technologies that I learned about trawling the Microsoft Research website. Wanna guess? It was approved.

The only snag was that they hated my title suggestion. It had the words “Artificial Intelligence” in it. This was during an AI Winter, and the publisher refused to, in her words, “kill the book with the title”. Yeah, AI was a bad word at that time. We settled on “Building Intelligent .NET Applications”.

I then spent over a year working tirelessly to write code, along with the 30,000 words for the book itself. The book was published in March of 2005. Amazingly, it is still available for sale on Amazon. I actually got someone from Microsoft Research to write a blurb that was featured on the cover of the book. Impressive, right? I sure was impressed.

Spoiler alert: The book was a commercial failure. The title was not the only problem. Despite the financial loss I suffered from writing that book, it paid off in many other ways.

More about those to come…


Surviving Big Tech – Post 3

The Web Is Amazing

Not the issue that my article was published in

The financial company I was working for was in the middle of a growth spurt. They were spending money like drunk sailors. This meant lots of opportunities. I took advantage of them all.

I volunteered to work with a group in designing a series of classic Active Server Pages (ASP) for my company. This was prior to ASP.NET, so the technology was brand new and very immature. What is now commonly referred to as Developer Experience (DX) was horrible for these pages.

The biggest challenge I faced was designing a page that would not crash when deployed. I loved the technology and could visualize all that the web is today. I thought it was like finding gold. I quickly figured out that I needed some way to stress test these pages before deployment.

I started frantically researching how to stress test my pages. Again, it was mostly trial and error. I learned a lot – the hard way. But, it was all very exciting and I was fixated on the issue.

My communication skills were always a strength. I can thank my mother, who was a creative dramatics teacher and college speech professor, for that one. For this reason, I would create a lot of developer documentation – just for fun. Seriously. It was fun for me. No one even asked me to do it. By the way, I can just feel your eyes rolling as you read this.

At one point, the company brought in a $2000/day contract resource to mentor all the developers about best practices, tips and tricks, etc. I spoke to him as much as possible.

I showed him one of the doc files I had created for an app I was working on. He was actually impressed. I could not believe someone making $2000/day would be impressed with me. He gave me some advice and I soaked up every word. He encouraged me to try professional tech writing. I was inspired.

I spent all my spare time working on an article for MSDN magazine. The article was about load/stress testing an ASP application. My article was only 1500 words, while most articles in that prestigious journal were 10,000 words. They were all dry and written like technical white papers.

I was sure my article would be rejected. Imagine my surprise when it was accepted and get this – it was put on the cover page of next month’s issue. I felt like I had won the lottery.

My excitement was barely containable. I am sure I annoyed all of my co-workers when I shared the news with them. I would show you a reference, but that magazine does not keep articles prior to 2003. This was 1998 or 1999.

Onward and upward…


Surviving Big Tech – Post 2

Hello Microsoft VB 6.0

Visual Basic 6.0

I am about to seriously date myself, but why not. Discovering Microsoft Visual Basic 6.0 (VB) was, for me, a big experience. I was just out of college and working as a programmer for a health company. At that time, everything was driven off the mainframe. This was South Louisiana, may I remind you – not Silicon Valley.

I hated the mainframe. I hated COBOL. But I needed to make a living and find a way out of a life that everyone I worked with had simply accepted. Until the release of VB 6, I had no clue how to get there. I had dabbled with C++, but with no help, coming up to speed was like climbing Mount Everest. Neither was going to happen for me.

Microsoft VB was a game changer. It opened the door to a place I desperately wanted to be. Installing the tool and building a few forms was like having a spiritual awakening. I was hooked. My employer thought there was no use for it. I kept pushing, until they eventually accepted it “might” offer some value.

A chance was all I needed. I spent as much time as I could learning everything possible. Mostly, a lot of trial and error. This was right before Google was introduced. Back then, developers just had to figure things out by themselves. I know, crazy stuff, right?

I eventually left the job doing COBOL and joined a financial company. They saw the appeal of VB and offered me a lot of money. What more could I ask for? This was a tough time because I was still learning so much and I made a LOT of mistakes. But it was also a time of great excitement and a lot of growth. The growth was not just in my career, but my midsection as well. I had two children during this time.

Stay tuned…


Surviving Big Tech – Post 1

In this series, I will be getting personal. I will be sharing stories of major events (with lots of candor) that led to my career in this industry. A career which you may eventually be surprised to learn I am still extremely grateful to be a part of. Take what you like and leave the rest please.

Humble Beginnings

When I was a child, I visited this place – located less than an hour from my home in Southern Louisiana. It is a former plantation, and you will be grateful to know that this post has nothing to do with its history. It is just representative of the mindset and culture that dominated where I grew up. Enough said.

At an early age I quickly realized that I did not belong here. I was actually born in Manhattan New York. But, for various reasons my parents moved to this place when I was a young child and this is where I grew up and where my family continues to reside. I LOVE my family and that is why I have remained here. Enough said about that.

Pretty girls in Louisiana were not valued for their intellect. I knew that. So I hid mine – all through school until college. As a result, I scored terribly on the math portion of the ACT.

Entering college, I was desperate for something meaningful, as well as some money. I had neither at that time. Unfortunately, because I had scored so badly on the math portion of the ACT, my college advisor actually told me not to pick any major that was math oriented. By then, I was pissed and defiant. I was not accepting that outcome.

I passed on the counselor’s well-meaning advice and immersed myself in math classes. I had to start with remedial math (which was beyond humiliating). But I made it through quite easily. Turns out, when I actually tried, I was both good at math and liked it very much.

I graduated from Louisiana State University with good grades in Quantitative Business Analysis with an option towards Computer Science. It was all math, with some computer science classes. I loved it. I was hooked. And thus my career began.

Stay tuned….


Salesforce Security – Build a Record Sharing Model – Post 3

This post builds on Prepare for the Salesforce Sharing and Visibility Exam – Understand the Salesforce Sharing Model – Post 2, in which I introduced you to the Salesforce Record Sharing model for internal users. In this post, I will walk you through creating a record sharing model in your own Salesforce org.

In this series, I will be sharing highlights from courses that are part of the Salesforce Certified Sharing and Visibility Designer Skill Path on Pluralsight. The skill path is designed to help anyone trying to pursue the Salesforce Sharing and Visibility Designer certification.

Setting Org-wide Defaults

Org-wide defaults (OWD) provide a default level of access for records owned by users. More importantly, they can be used to limit data access for each standard or custom object. You can set different levels for internal versus external users.

Default levels:

  • Private is the most restrictive and means that only the record owner and users above them in the role hierarchy can view or edit records.
  • Public read only allows all users to view records, but not edit them.
  • Public read/write is the least restrictive level and means full access for all users.
  • Public read/write/transfer is used only for case and lead records, since these types of records can be transferred to another owner.

For each object there is a “Controlled by parent” checkbox. As you might guess, this affects objects that are children. They will inherit the access level of the parent.

Every time a user attempts to access a record from a particular object, Salesforce will first check the OWD. If it is set to Private or Public Read Only, the system will look at the object’s sharing table and join the group membership table based on the ID of the user trying to access the record.

Depending on what is found, the least restrictive access will be granted. The access grant, or sharing row cause, will be stored as a record in the sharing tables that I told you about in the last post. This is the record access calculation process.
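The “least restrictive access wins” step of this process can be sketched in a few lines. To be clear, this is illustrative JavaScript of my own, not how Salesforce actually implements the calculation; the level names and the `effectiveAccess` helper are simplified inventions:

```javascript
// Access levels, ordered from most restrictive to least restrictive.
const LEVELS = ['None', 'Read', 'Edit', 'All'];

// Given every grant found for a user across the sharing and group
// membership tables, the effective access is the least restrictive one.
function effectiveAccess(grants) {
  return grants.reduce(
    (best, level) =>
      LEVELS.indexOf(level) > LEVELS.indexOf(best) ? level : best,
    'None');
}

// A Read grant from a sharing rule plus an Edit grant from the role
// hierarchy results in Edit access.
console.log(effectiveAccess(['Read', 'Edit'])); // Edit
```

With no grants at all, the user falls back to no access, which is exactly what a Private OWD with no sharing rows means.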

Designing a Role Hierarchy

Role hierarchies provide data access for a group of users. They allow managers access to records owned by their employees. Essentially, record access is opened up vertically to users higher up in the hierarchy. By default, peers or other members assigned to the same role will not have access to these records.

…role hierarchy should NOT be a duplicate of your company org chart.

The role hierarchy is just one tool included in the Salesforce Sharing Model. The role hierarchy sits right in the middle. This means that baseline access, implicit sharing or org-wide defaults will override access provided by the role hierarchy.

Salesforce Internal Sharing Model

When designing a role hierarchy, the following things should always be kept in mind:

  • Your role hierarchy should NOT be a duplicate of your company org chart.
  • Users should be grouped into access levels. Only users assigned to roles above them will have the same access as the record owner.
  • Roles should only be created for permanent groups of users and not a group that is considered temporary because changes cause expensive sharing recalculations to take place.
  • Orgs created prior to the Spring ’21 release are limited to 500 roles.

Sharing Rules and Manual Sharing

It is important to realize that the entire Salesforce sharing model is built around record ownership. When a user creates a new record, they automatically become the record owner. Records can also be assigned to queues.

…the entire Salesforce sharing model is built around record ownership

Sharing rules are created for objects, but you only do this if the OWD for that object is set to Private or Public Read Only. Otherwise, there is no need to create a sharing rule, since everyone already has access to records of that object. Sharing rules open record access back up for a select group of users.

When thinking about selecting a group of users, they can be assigned to a public group, which is not the same thing as a queue. A public group consists of one or more users. These users can be individual users, or they can belong to a role or territory. A public group cannot own a record, but it can be part of a sharing rule.

Queues on the other hand are generally used to manage ownership of objects such as leads, cases, and even custom objects. The record is owned by the queue and not any of the users assigned to the queue. However, queues are not used in sharing rules. Sharing rules are configured with public groups or roles.

Manual sharing happens when one user wants to share a record with another user. They can do this by clicking a button in the user interface. But keep in mind that manual sharing is only available for accounts, contacts, cases, opportunities, leads and custom objects.

Only certain users can grant this kind of access. This includes the:

  • Record owner
  • Users above the record owner in the role hierarchy
  • Any user that has been granted full record access
  • Administrators

Manual sharing is primarily used for special exceptions. That is why it sits above sharing rules in that upside down triangle you saw earlier.
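The list above can be condensed into a short sketch. This is illustrative JavaScript only; the object shapes and the `canManuallyShare` helper are invented for this example and are not a Salesforce API:

```javascript
// A user may manually share a record when any of the four cases
// from the list above applies.
function canManuallyShare(user, record) {
  return (
    user.id === record.ownerId ||                   // record owner
    user.subordinateIds.includes(record.ownerId) || // above the owner in the role hierarchy
    record.fullAccessUserIds.includes(user.id) ||   // granted full record access
    user.isAdmin                                    // administrator
  );
}

const record = { ownerId: 'u1', fullAccessUserIds: ['u3'] };
const manager = { id: 'u2', subordinateIds: ['u1'], isAdmin: false };
console.log(canManuallyShare(manager, record)); // true
```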

Stay tuned for upcoming posts in this series and you may want to checkout the Salesforce Certified Sharing and Visibility Designer Skill Path on Pluralsight.

Prepare for the Salesforce Sharing and Visibility Exam – Understand the Salesforce Sharing Model – Post 2

Welcome back to the second post for this series. In this post you will be introduced to the Salesforce Record Sharing Model for internal users. This will involve sharing Salesforce data across users, groups and roles.

In this series, I will be sharing highlights from courses that are part of the Salesforce Certified Sharing and Visibility Designer Skill Path on Pluralsight. The skill path is designed to help anyone trying to pursue the Salesforce Sharing and Visibility Designer certification.

Diagnosing User Access Issues

Most Salesforce Administrators will eventually be asked why a user cannot access certain data. In situations such as these, it is helpful to refer to a diagram such as the one below.

Salesforce Sharing Model for Internal Users

The Salesforce sharing model can be imagined as an upside down triangle. Tools at the top of the triangle provide the widest level of access to the greatest number of users. Tools located at the bottom can be used more precisely to grant certain kinds of access to the least number of users.

When trying to figure out user access problems, you would start at the bottom. In other words, baseline access should be the first thing you look at, followed by implicit sharing and so on. Keep going up the triangle until you determine what the problem is.

  • Baseline Access: a combination of profile permissions along with permission sets.
  • Implicit Sharing: Salesforce’s built-in sharing behavior between accounts and child records (contacts, cases and opportunities).
  • Org-wide Defaults (OWD’s): determine an object’s default access level and are the only way to limit record-level access.
  • Role Hierarchy: expands data access vertically, allowing managers to access records owned by the users they manage.
  • Sharing Rules: define criteria for sharing access with specific users or users in public groups/roles.
  • Manual Sharing: typically used for special circumstances; users can intentionally grant record access to a user that would not normally have access.
  • Team Access: used to grant access to teams, which are groups of users that work together on objects like accounts, opportunities or cases.
  • Territory Hierarchy Access: used to manage and grant account access to users assigned to sales territories.
Salesforce Record Sharing Tools

I am not going to lie to you, the Salesforce record sharing model is complex. But don’t be overwhelmed because throughout this series, I will be introducing you to all of these tools.

Working with Access Grants

Access grants are what Salesforce uses to determine who sees what data. The process of determining this all starts with an object sharing table. Object sharing tables are completely separate from the object table itself, where the actual Salesforce data (such as the information about an account) lives.

Sharing tables will store information about the grant (or sharing type), such as whether it is explicit or implicit. Implicit grants happen when there are child records associated with a parent.

For instance, accounts and contacts are designed with this kind of relationship. Contacts are considered children of a parent account, and understandably, users that can access a contact can also access the account.

Implicit grants will override explicit grants, which happen when a record is shared with manual sharing or sharing rules. So it is important to always keep implicit grants, or implicit sharing in the back of your mind. If you ever have a situation where you cannot figure out why a user is accessing a record, consider implicit sharing.

If you ever have a situation where a user is accessing a record you think they should not have access to, consider implicit sharing.

Object sharing tables are created automatically and follow a very specific naming pattern. For example, when the object record table is named Account, the sharing table will be named AccountShare. And the thing that ties these two tables together is the owner of the record. When dealing with a custom object such as one named myCustomObject, the sharing table will have the object name followed by two underscores and the word Share.
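Assuming custom object API names carry the usual `__c` suffix, the naming pattern can be sketched with a small helper. The `shareTableName` function is my own and purely illustrative, not part of the platform:

```javascript
// Derive the sharing table name from an object's API name.
// Standard objects: append "Share". Custom objects: replace the
// trailing "__c" with "__Share".
function shareTableName(objectApiName) {
  if (objectApiName.endsWith('__c')) {
    return objectApiName.slice(0, -'__c'.length) + '__Share';
  }
  return objectApiName + 'Share';
}

console.log(shareTableName('Account'));           // AccountShare
console.log(shareTableName('MyCustomObject__c')); // MyCustomObject__Share
```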

Determining what values go into a sharing table occurs when record access is calculated. This is a separate process from when a user attempts to access the actual record in the user interface or with an API. This process only happens during a configuration change, such as creating a new custom object. And you should know that it is a very complicated, resource-intensive process known as sharing recalculation.

Things are done this way in order to improve record access performance. If all this checking was done in real time, the system would not perform well at all and users would be very unhappy.

Record access calculations can happen when changes are made to:

  • Group membership
  • Role hierarchies
  • Territory hierarchies

They can also be kicked off manually by an Administrator.

It is important to realize that record access calculations can act like a ripple effect in a Salesforce org. For this reason, large orgs should be especially careful when kicking this off or making any changes that might trigger it. The process could negatively impact the org’s performance.

Since this is such a HUGE topic, I will not be covering everything in this one post. Stay tuned for the next post where I will be covering more.

Prepare for the Salesforce Certified Sharing and Visibility exam – Getting Started – Post 1

In this series, I will be sharing highlights from courses that are part of the Salesforce Certified Sharing and Visibility Designer Skill Path on Pluralsight. The skill path is designed to help anyone trying to pursue the Salesforce Sharing and Visibility Designer certification.

The first course in the series, Salesforce Security: Getting Started, will use complex customer scenarios for a global robotics provider named Globomantics. You will learn how to evaluate the use of object, field-level, role and security settings to secure the Salesforce org.

Salesforce Security Levels

At a high level, Salesforce enforces security through multiple layers or levels. At the outermost layer there is organization access. This is controlled by login tools that allow you to control when and how a user logs in.

High-level overview of the different Salesforce security layers

Once logged in, access to objects and fields is controlled through a combination of profiles and permission sets. Profiles must be assigned to each user and permission sets are assigned to specific users.

Record level sharing sits between the object and field level. Once a user is granted access to an object, then specific data records can be shared with them. But access can be restricted on a field level basis, providing even more granular access.

At the record level, access is controlled through data sharing. For each object, this access can be restricted through Org-wide defaults (OWD’s), which apply to all users. Roles and sharing can then be used to open this access back up to certain users.

Controlling Access to a Salesforce Organization

Controlling access to a Salesforce organization (or org) is the first line of defense. Authentication is the process of verifying that a user is who they say they are.

Salesforce authentication covers a broad spectrum of available tools where tools on the left are the least complex, such as passwords associated with a username, and tools on the right are the most complex and offer tighter security.

Salesforce login tools on the left side are the least complex

Multi-factor authentication (MFA), which you may know as two-factor authentication, involves verifying a user’s identity with two or more pieces of information. Starting in 2022, Salesforce will require customers to use MFA for internal users.

Network-based security deals with limiting where and when a user can login to Salesforce. Device activation (or identity confirmation) involves tracking information about the device used to verify a user’s identity.

Every computer or device that is connected to a public network is assigned a unique IP (Internet Protocol) address. When a user logs in to Salesforce for the first time, they are sent an activation email that is associated with that address.

When the Salesforce user clicks a link in that email using the same device they logged in with, they are directed back to Salesforce, and the device is activated and considered secure. Salesforce Admins can allow users logging in from an internal network to skip device activation by setting a trusted IP range. This is done at the org level and applies to all internal users.

Besides setting a trusted IP range, Administrators can restrict a user’s login IP range, along with specific login hours, through permissions assigned to users at the profile level. By setting a login IP range, all users assigned to that profile will not be able to log in from an IP address outside of that range.

So remember, adding a trusted IP range at the org level will only remove computer activation requirements for users logging in from a device within that range. Login IP ranges, which are applied at the profile level, will instead prevent a user from logging in from a device outside the range.
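The difference between the two settings can be made concrete with a short sketch. This is illustrative JavaScript, not anything from the Salesforce platform: the helper names (`ipToNumber`, `ipInRanges`, `needsDeviceActivation`, `loginAllowed`) are mine, and ranges are simplified to inclusive start/end pairs of IPv4 addresses:

```javascript
// Convert a dotted-quad IPv4 address to a number for easy comparison.
function ipToNumber(ip) {
  return ip.split('.').reduce((n, octet) => n * 256 + Number(octet), 0);
}

// True when ip falls inside any [start, end] range (inclusive).
function ipInRanges(ip, ranges) {
  const n = ipToNumber(ip);
  return ranges.some(([start, end]) =>
    n >= ipToNumber(start) && n <= ipToNumber(end));
}

// Trusted IP ranges (org level): logins from inside the range simply
// skip the device activation step; logins from outside still work.
function needsDeviceActivation(ip, trustedRanges) {
  return !ipInRanges(ip, trustedRanges);
}

// Login IP ranges (profile level): logins from outside the range are
// blocked outright. A profile with no ranges places no restriction.
function loginAllowed(ip, profileLoginRanges) {
  return profileLoginRanges.length === 0 || ipInRanges(ip, profileLoginRanges);
}

const ranges = [['10.0.0.0', '10.0.255.255']];
console.log(needsDeviceActivation('10.0.1.5', ranges)); // false
console.log(loginAllowed('203.0.113.9', ranges));       // false
```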

Controlling Access to Objects and Fields

Salesforce launched their CRM product back in 2000, and at that time the only way to grant user permissions to objects and fields was through the profile, which had a one-to-one relationship with the user object. As the platform grew, along with the number of permissions, things got much harder to manage.

Salesforce eventually introduced permission sets as a way to alleviate the pain points that were associated with profiles. Permission sets can be assigned to more than one user and for a while they were easier to manage.

Over time, as some orgs got bigger and had to manage lots of permission sets, even these became hard to keep up with. And so, Salesforce introduced the concept of permission set groups.

As a best practice, Administrators should use permission sets or permission set groups to grant permissions and not the profile. In a nutshell, profiles should be used to restrict access and permission sets or groups should be used to grant access to specific users.

As a best practice, Administrators should use permission sets or permission set groups to grant permissions and not the profile.

When thinking about granting permissions to a user, a best practice involves the principle of least privilege. This means that users should only be given the minimum permissions necessary to do their job. By keeping this in mind when approaching permissions, you can ensure that users are able to do their jobs while also protecting the integrity of the entire org.

Access to objects is granted through CRUD permissions. CRUD stands for Create, Read, Update and Delete. Salesforce allows you to assign two other permissions to objects (View All and Modify All) that you might like to think of as superpower permissions. For this reason, these permissions should be given sparingly and typically only to Administrators.

Field access is handled with field-level security, or FLS as it is better known. Where objects can be granted CRUD permissions, fields can only be granted read or edit access. Fields also do not have equivalents of the superpower View All or Modify All permissions.

It is important to realize that object-level access overrides FLS. For example, you cannot remove read permission for an object and then enable it for all that object’s fields. If the user does not have read access to an object, then the user can not see any of the object’s fields.
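A tiny sketch makes the override rule concrete. This is illustrative JavaScript; `effectiveFieldAccess` is a name I made up for this example, not part of the platform:

```javascript
// Field-level security can only narrow what the object-level (CRUD)
// permissions already grant, never widen it.
function effectiveFieldAccess(objectPerms, fieldPerms) {
  return {
    read: objectPerms.read && fieldPerms.read,
    edit: objectPerms.read && objectPerms.edit && fieldPerms.edit
  };
}

// No read access on the object hides every field, even fields
// marked readable by FLS.
console.log(effectiveFieldAccess({ read: false, edit: false },
                                 { read: true, edit: true }));
// { read: false, edit: false }
```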

Controlling Access to Salesforce Records

Access to Salesforce records revolves around the concept of record ownership. Each record, or row of data as you might like to think of it, can be owned by an individual user or a group of users assigned to a queue.

Org-wide defaults (OWD’s) are the only way you have of limiting record access. All the other tools open up access to users through sharing tools or the role hierarchy.

OWD’s set the default level of access users have to each other’s records. Each object will be assigned a level (Private, Public Read Only or Public Read/Write), and these can be different for internal versus external users. Public Read/Write/Transfer is a special level that is only available for lead and case objects, since those are the only kinds of records that allow for a transfer.

The role hierarchy opens up access to data records vertically. Access rolls up from the bottom of the hierarchy to the top. This allows managers to have access to the records of the employees they manage. But peers at the same level do not have access to each other’s records.

The role hierarchy opens up access to data records vertically.

Besides the role hierarchy, there are other sharing tools that open up access restricted by OWD’s. Record sharing is a huge topic and for that reason it was given an entire course for this series. I only mention it briefly here so you know where it fits with other Salesforce security access levels.

For more detail about sharing records, you will need to check out my upcoming post about the next course in this series, Share Salesforce Data Across Users, Groups and Roles. Hope you stay with me.

New Course on Managing Account Access with Salesforce Territory Management

This course is one of several designed to prepare you on your journey towards completing the Salesforce Certified Sharing and Visibility Designer Exam. In this course, you will learn how to successfully set up a territory model using Salesforce’s Enterprise Territory Management.

By the end of the course, you will know how to do the following:

  1. Define Salesforce Enterprise Territory Management
  2. Understand the benefits of using Enterprise Territory Management
  3. Understand how to set up Territory Management in Salesforce
  4. Explain the capabilities of Enterprise Territory Management and its impact on data visibility
  5. Explain how Enterprise Territory Management impacts forecasting in Salesforce
Globomantics is a robotics company that needs to configure Salesforce Enterprise Territory Management

The fictional company at the center of the storyline for this course, and the whole series, is Globomantics. The main character of the story is a Salesforce Architect who has been asked to design the security model for this Salesforce Enterprise customer. In this course, the Architect is continuing her analysis of the org and is now assisting with building territories to help with account management.

If you are able to check out the course, I would love to hear your feedback. Especially, if you are using this course to help prepare you for the certification exam.

Why Lightning Web Runtime Is a Game Changer for All Web Developers

Lightning Web Runtime (LWR)

Salesforce’s annual Dreamforce 2021 was different in many ways. A very select number of people were invited to attend (thanks, COVID). Most of Dreamforce was done virtually for the widest audience possible. For this reason, the content delivered came off more as an infomercial and did not have the meat that Dreamforce attendees are used to. I kept waiting for a 1-800 number to appear on the screen. It had to be said.

What was the same was that there were lots of big announcements. But due to the infomercial style, most Salesforce developers probably missed the biggest announcement: Lightning Web Runtime is available as an NPM package.

So what does this really mean? In my opinion, it is a game changer for Salesforce developers, but possibly all web developers.

For starters, the Lightning Web Runtime (LWR) means that Lightning Web Components (LWC’s) can be used to assemble web applications that render a full page in less than one second. Yes, that is what I said. BOOM!

This means that Lightning Apps can actually render as fast as we have all been hoping for since the word “Lightning” was introduced. Will the real modern high performance web architecture please stand up?

Thanks to page generation at build time, not runtime, our bar is set at subsecond full-page loads.

LWR Docs –

But the best part is that because of the release of the NPM package, serious web developers can build high performing web apps that may or may not connect to Salesforce. I’m serious. This is the flexibility that Salesforce developers have been wanting forever.

What is coming? Open Source, but no time frames on when that will be. I also imagine there will be more features and hopefully lots of demo apps.

My advice to you is to “Run, don’t walk” to the LWR guide as soon as possible. I know I am. And stay tuned here, because there is so much more to come.

Post 3 – Communicate Between Unrelated Salesforce Lightning Web Components

This is the latest series of posts I am doing on a new Pluralsight course I created titled, “Communicate Between Salesforce Lightning Web Components”. You can also check out a free 10-minute preview video that explains how I got to this point in the post series.

Welcome Back

In the last post, I showed you a solution to display products, but only focused on how parent to child communication worked in that example. In this post, I will point out the portions of the code that involve working with an unrelated component and show you the unrelated component that displayProducts will be working with.

Just to refresh your memory, the code for displayProducts is as follows:

<!-- displayProducts.html -->

<template>
    <div class="slds-card slds-var-p-around_x-small">
        <lightning-input type="search" label="Search Key"
            onchange={handleSearchKeyChange}></lightning-input>
        <template if:true={products.data}>
            <template if:true={products.data.length}>
                <div class="content">
                    <template for:each={products.data} for:item="product">
                        <!-- Product Tile Component here -->
                        <c-product-tile key={product.Id} product={product}
                            onselected={handleProductSelected}></c-product-tile>
                    </template>
                </div>
            </template>
            <template if:false={products.data.length}>
                <p>No products matching your selection</p>
            </template>
        </template>
        <template if:true={products.error}>
            <p>Error when getting product data</p>
        </template>
    </div>
</template>

// displayProducts.js

import { LightningElement, wire } from 'lwc';

import { publish, MessageContext } from 'lightning/messageService';
import PRODUCT_SELECTED_MESSAGE from '@salesforce/messageChannel/ProductSelected__c';

// getProducts() method in ProductController Apex class
import getProducts from '@salesforce/apex/ProductController.getProducts';

export default class DisplayProducts extends LightningElement {

    searchKey = '';

    // Load context for Lightning Message Service
    @wire(MessageContext) messageContext;

    // Publish ProductSelected message
    handleProductSelected(event) {
        publish(this.messageContext, PRODUCT_SELECTED_MESSAGE, {
            productId: event.detail
        });
    }

    // Load the list of available products.
    @wire(getProducts, { searchKey: '$searchKey' })
    products;

    handleSearchKeyChange(event) {
        this.searchKey = event.target.value;
    }
}


Notice in the code above that each productTile uses an onselected event to fire off the handleProductSelected function. Also notice that at the top of the JavaScript controller, publish and MessageContext functions are imported from lightning/messageService. These are required when working with Lightning Message Service (LMS).

Working with Lightning Message Service

Lightning Message Service (LMS) was introduced in Summer ’20 and it offers a standard way to communicate across the DOM, or Document Object Model. LMS can be used with LWCs, along with Aura components and even Visualforce pages, as long as they are contained within the same Lightning page.
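Under the hood, LMS follows the classic publish/subscribe pattern. Here is a minimal, framework-free sketch of that pattern in plain JavaScript, so you can see the mechanics without any Salesforce machinery. This is my own illustration, not Salesforce's implementation:

```javascript
// Minimal publish/subscribe sketch (illustrative, not Salesforce's code).
// A "message channel" is just a key; subscribers are callbacks keyed by channel.
const subscriptionsByChannel = new Map();

function subscribe(channel, listener) {
    if (!subscriptionsByChannel.has(channel)) {
        subscriptionsByChannel.set(channel, new Set());
    }
    subscriptionsByChannel.get(channel).add(listener);
    return { channel, listener }; // handle used later to unsubscribe
}

function publish(channel, message) {
    const listeners = subscriptionsByChannel.get(channel);
    if (listeners) {
        listeners.forEach((listener) => listener(message));
    }
}

function unsubscribe({ channel, listener }) {
    subscriptionsByChannel.get(channel)?.delete(listener);
}

// Usage mirrors the solution: one component publishes a product Id,
// an unrelated component receives it without any parent/child wiring.
const PRODUCT_SELECTED = 'ProductSelected__c';
let received;
const sub = subscribe(PRODUCT_SELECTED, (msg) => {
    received = msg.productId;
});
publish(PRODUCT_SELECTED, { productId: '01t000000000001' });
// received is now '01t000000000001'
unsubscribe(sub);
```

The real `lightning/messageService` module adds a message context and scoping on top of this idea, but the publish/subscribe flow is the same.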

The Lightning App Page that is used to host the solution has two regions. displayProducts will be loaded into the left-hand region and on the right will be a new component named productCard.

When a user clicks a product tile in displayProducts, a message will be sent to the productCard component, allowing it to display additional detail information. Communicating with unrelated components allows a user to easily switch between the tiles and additional product information without having to navigate to an actual Salesforce detail page.

Steps to work with LMS:

  1. Create a message channel.
  2. Define scope of message channel.
  3. Publish to the message channel.
  4. The component that should receive data subscribes to the message channel.
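Step 1, creating the message channel, is done with a metadata file. Below is a sketch of what a `ProductSelected.messageChannel-meta.xml` file could look like, matching the `ProductSelected__c` import and the `productId` payload used in the code. The description text is my own placeholder wording:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<LightningMessageChannel xmlns="http://soap.sforce.com/2006/04/metadata">
    <masterLabel>ProductSelected</masterLabel>
    <isExposed>true</isExposed>
    <description>Message channel used to send the Id of a selected product.</description>
    <lightningMessageFields>
        <fieldName>productId</fieldName>
        <description>Id of the selected Product2 record</description>
    </lightningMessageFields>
</LightningMessageChannel>
```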

Below is the code for the productCard component (which is the unrelated component in this example):

<!-- productCard.html -->

<template>
    <lightning-card icon-name="standard:apex_plugin">
        <template if:true={recordId}>
            <span slot="title">{productName}</span>
            <lightning-button-icon slot="actions" icon-name="utility:expand_alt"
                onclick={handleNavigateToRecord}></lightning-button-icon>
            <div class="slds-var-m-horizontal_medium">
                <img src={productPictureUrl} alt="Product picture" />
                <lightning-record-view-form record-id={recordId}
                    object-api-name="Product2" onload={handleRecordLoaded}>
                    <lightning-output-field field-name={productCodeField}></lightning-output-field>
                    <lightning-output-field field-name={familyField}></lightning-output-field>
                    <lightning-output-field field-name={msrpField}></lightning-output-field>
                    <lightning-output-field field-name={descriptionField}></lightning-output-field>
                </lightning-record-view-form>
            </div>
        </template>
        <template if:false={recordId}>
            <div class="slds-var-p-around_large">
                <p class="placeholder">Select a product to see details</p>
            </div>
        </template>
    </lightning-card>
</template>

// productCard.js

import { LightningElement, wire } from 'lwc';

// Lightning Message Service and a message channel
import { NavigationMixin } from 'lightning/navigation';
import { subscribe, MessageContext } from 'lightning/messageService';
import PRODUCT_SELECTED_MESSAGE from '@salesforce/messageChannel/ProductSelected__c';

// Utils to extract field values
import { getFieldValue } from 'lightning/uiRecordApi';

// Product2 schema
import PRODUCT_OBJECT from '@salesforce/schema/Product2';
import NAME_FIELD from '@salesforce/schema/Product2.Name';
import PRODUCT_CODE_FIELD from '@salesforce/schema/Product2.ProductCode';
import FAMILY_FIELD from '@salesforce/schema/Product2.Family';
import MSRP_FIELD from '@salesforce/schema/Product2.MSRP__c';
import DESCRIPTION_FIELD from '@salesforce/schema/Product2.Description';
import PICTURE_URL_FIELD from '@salesforce/schema/Product2.Picture_URL__c';

/** Component to display details of a Product2 record. */
export default class ProductCard extends NavigationMixin(LightningElement) {
    // Exposing fields to make them available in the template
    familyField = FAMILY_FIELD;
    msrpField = MSRP_FIELD;
    productCodeField = PRODUCT_CODE_FIELD;
    descriptionField = DESCRIPTION_FIELD;

    // Id of the Product2 record to display
    recordId;

    // Product fields displayed with specific format
    productName;
    productPictureUrl;

    /** Load context for Lightning Message Service */
    @wire(MessageContext) messageContext;

    /** Subscription for the ProductSelected Lightning message */
    productSelectionSubscription;

    connectedCallback() {
        // Subscribe to the ProductSelected message
        this.productSelectionSubscription = subscribe(
            this.messageContext,
            PRODUCT_SELECTED_MESSAGE,
            (message) => this.handleProductSelected(message.productId)
        );
    }

    handleRecordLoaded(event) {
        const { records } = event.detail;
        const recordData = records[this.recordId];
        this.productName = getFieldValue(recordData, NAME_FIELD);
        this.productPictureUrl = getFieldValue(recordData, PICTURE_URL_FIELD);
    }

    /**
     * Handler for when a product is selected. When `this.recordId` changes, the
     * lightning-record-view-form component will detect the change and provision new data.
     */
    handleProductSelected(productId) {
        this.recordId = productId;
    }

    handleNavigateToRecord() {
        this[NavigationMixin.Navigate]({
            type: 'standard__recordPage',
            attributes: {
                recordId: this.recordId,
                objectApiName: PRODUCT_OBJECT.objectApiName,
                actionName: 'view'
            }
        });
    }
}
Notice in the HTML code above that there is a lightning-button-icon component. If clicked, the user will be directed to the product's record page, in case that is what the user would prefer. Displaying the record detail information is done using the lightning-record-view-form base component, along with several lightning-output-field base components.

Also notice in the JavaScript code the import statements that access functions from Lightning Message Service, as well as the newly created message channel. The next import statement includes the uiRecordApi utility that allows me to extract field values for product fields, followed by several import statements used to access schema metadata for specific Product2 fields.

Inside the class, I begin by adding some local variables that expose some of that schema metadata. Below that, I need to load the message context, which in this case uses the default scope of the active area. This is only available to me when using the @wire adapter.

Next is the handleRecordLoaded handler, which fires when the lightning-record-view-form base component first loads. It takes the product record ID received earlier over the message channel and uses the uiRecordApi utility to extract the field values for the product name and picture URL.
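To make that extraction step concrete, here is a simplified sketch of the record shape the form provisions and what getFieldValue does with it. The sample record values are made up for illustration, and this is not the real `lightning/uiRecordApi` module:

```javascript
// Simplified stand-in for lightning/uiRecordApi's getFieldValue (illustrative only):
// a field value lives at record.fields[<fieldApiName>].value.
function getFieldValue(record, field) {
    const fieldValue = record.fields[field.fieldApiName];
    return fieldValue ? fieldValue.value : undefined;
}

// Trimmed-down example of what event.detail.records[this.recordId] can look like.
const recordData = {
    apiName: 'Product2',
    fields: {
        Name: { value: 'Electra X4' },
        Picture_URL__c: { value: 'https://example.com/electra-x4.jpg' }
    }
};

// Schema imports resolve to objects carrying the field's API name.
const NAME_FIELD = { objectApiName: 'Product2', fieldApiName: 'Name' };

const productName = getFieldValue(recordData, NAME_FIELD);
// productName is 'Electra X4'
```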

The end result of this solution is two unrelated components that communicate seamlessly. The image below shows what it looks like to have the components work together in a Lightning page for Experience Cloud.

Final rendered solution in Salesforce Experience Cloud

The code and explanations in this post are just a sample of all that is covered in the Pluralsight course, “Communicate Between Salesforce Lightning Web Components”. Check that out if you are interested in learning more about how LWCs communicate. You can also check out a free 10-minute preview video that explains how I got to this point in the post series.