Business
Chase Bank Manager’s Advice for Avoiding Scams and Fraud
Chances are you know someone who has been a victim of fraud or a scam. Since the pandemic, fraudsters have been finding new ways to reach new victims. The good news is that there are simple ways for consumers to stay safe and fight back.
Scammers are always coming up with new ways to get you to part with your money. In a recent Chase survey of 2,000 consumers, 84% of survey respondents agreed that scams and scammers have become more sophisticated in recent years.
To spot fraud, it’s important to learn more about the most widespread scams, how to prevent them and what to do if you think you may be a victim.
We sat down with Myesha Brown, local community manager from the Chase branch on 3005 Broadway, to help you become more aware of the most common scams out there and what steps you can take so that you can protect your information and keep your hard-earned money safe.
Oakland Post: How common are fraud and scams?
Brown: Fraud cases are unfortunately becoming more prevalent, and more sophisticated. In fact, the FTC revealed that 3.5 million people reported being a victim of fraud or identity theft in 2020, an increase of more than 1.5 million from the previous year. For Black communities, the problem is even worse, with 40% of Black adults reporting that they have been targeted by online scams and fraud, according to AARP.
Oakland Post: What kinds of scams should we be aware of, and how can we stop them?
Brown: While new scams are always popping up, there are several common approaches that keep resurfacing year after year.
Some of the more common scams we’ve encountered and helped our customers fight may not seem obvious at first if you’ve been targeted.
What shocks many of our customers is how far scam artists will go to impersonate familiar faces, whether close relatives or community officials, and how easily they can trick you by using your own smartphone against you. So the next time you receive a text message or email on your phone, think twice before you engage.
Here is what I mean:
Fake bank fraud specialist
What they look like: Consumers receive a fraud alert via text message that appears to come from their bank. The message asks them to validate whether they made a certain purchase or sent a certain amount of money. After saying “no,” the recipient gets a call from someone claiming to be from their bank’s fraud team. The phone number may even appear to be a real number from their bank.
They’ll ask for the customer’s banking username, password or a one-time passcode. Alternatively, they’ll sometimes ask the customer to send money to themselves or a third party to “stop” the fraud or to get their money back. Once the scammer has gained access to a person’s account or convinced them to send money, they usually stop contact and the victim’s money is gone.
How to stop them: Unfortunately, scammers target consumers from many banks and they are very good at disguising themselves by “spoofing” or making their phone number appear legitimate. Consumers should never share their banking password, one-time passcode, ATM pin or send money to someone who says that doing so will prevent fraud on their account. Bank employees won’t call, text or email consumers asking for this, but crooks will. If you receive a call like this, hang up and call the phone number on your account statement, the back of your credit or debit card or bank website to verify the authenticity of the request.
Imposter scams
What they look like: Someone will call or email you claiming to be from an organization you trust, like the Internal Revenue Service. They may threaten you by saying that if you don’t pay taxes or fees owed, they’ll bring a lawsuit against you.
How to stop them: If you think there’s truly a possibility that you owe money, don’t pay it to someone who initiates a call or email to you. Instead, hang up and call the organization in question directly.
Grandparent scams
What they look like: You’ll get an email from a grandchild (or other relative) saying that they’re in trouble and need money fast.
How to stop them: Call your relative directly. If you can’t reach them, contact another relative who knows them and may know their whereabouts and circumstances. Whatever you do, don’t send money, purchase gift cards, or share any of your personal information, including your banking username and password. Scammers use threats and try to create a sense of urgency to trick you. Always trust your gut and end communication when something seems off.
Oakland Post: What should you do to stop scam artists?
Brown: There are steps you can take to protect yourself.
And while we’ve given this advice before in this newspaper, it is worth repeating in these pages. Here’s what we recommend you do and don’t do:
DO:
- Educate yourself on the most common scams. Fraudsters will use anything to their advantage — claiming to be from the IRS, pretending to offer tech support, baiting you with prizes or cash winnings — the sky’s the limit!
- Monitor your credit score for free with Chase Credit Journey — you don’t even need to be a Chase customer to sign up! It will notify you if your data is compromised, and you’ll receive critical alerts that help protect your credit and identity.
- Review your accounts closely if you believe you may have fallen for a scam. With Chase, you can also set up account alerts so you can be notified of transactions on your account.
DON’T:
- Click on links in emails or texts unless you’re sure they’re from a credible source. Only access your accounts through the bank’s mobile app or website.
- Share personal information. Neither Chase nor any other bank will ever ask for your username, password, ATM PIN, etc. when reaching out to you. Banks may ask for this information only when you call to discuss your account.
- Transfer money to someone claiming to be from your bank. Banks will never ask you to send money via wire, check or any other method to “stop or prevent fraud.”
- Pay someone using gift cards, especially when they claim to need them to remove a virus from your computer, stop fraud on your account or to buy plane tickets to come visit you.
Oakland Post: What more can you do to protect yourself from fraud and scams?
Brown: One of the most effective things you can do to prevent fraud is to regularly monitor your bank and credit card accounts so that you can be on the lookout for signs of unusual activity.
Your bank’s mobile app can give you easy access to self-service tools that let you track your finances 24/7. If you spot something suspicious, report the activity to your bank immediately.
Many banks, including Chase, also let you set up account alerts that send automatic notifications to help you detect unusual transactions on your bank or credit card accounts.
If you’re not sure if your bank or financial institution already offers these tools or services, be sure to ask.
If you believe that you may have been a victim of fraud or scams, there’s no need to feel embarrassed or ashamed. It can happen to anyone. What’s most important is to take immediate action.
Sponsored content from JPMorgan Chase & Co.
To learn more about common scams and how to stop scammers in their tracks visit: www.chase.com/security-tips. You can also learn tips to identify and avoid financial abuse by visiting: www.chase.com/financialabuse.
Activism
California Rideshare Drivers and Supporters Step Up Push to Unionize
By Antonio Ray Harvey
California Black Media
On July 5, 1935, President Franklin D. Roosevelt signed into federal law the National Labor Relations Act (NLRA). Also known as the “Wagner Act,” the law paved the way for employees to have “the right to self-organization, to form, join, or assist labor organizations,” and “to bargain collectively through representatives of their own choosing,” according to the legislation’s language.
Today in California, over 600,000 rideshare drivers want the ability to form or join unions for the sole purpose of collective bargaining or other mutual aid and protection. It’s a right, and recently at the State Capitol, a large number of people, including some rideshare drivers and others working in the gig economy, reaffirmed that they want to exercise it.
On April 8, the rideshare drivers held a rally with lawmakers to garner support for Assembly Bill (AB) 1340, the “Transportation Network Company Drivers (TNC) Labor Relations Act.”
Authored by Assemblymembers Buffy Wicks (D-Oakland) and Marc Berman (D-Menlo Park), AB 1340 would allow drivers to create a union and negotiate contracts with industry leaders like Uber and Lyft.
“All work has dignity, and every worker deserves a voice — especially in these uncertain times,” Wicks said at the rally. “AB 1340 empowers drivers with the choice to join a union and negotiate for better wages, benefits, and protections. When workers stand together, they are one of the most powerful forces for justice in California.”
Wicks and Berman were joined by three members of the California Legislative Black Caucus (CLBC): Assemblymembers Tina McKinnor (D-Inglewood), Sade Elhawary (D-Los Angeles), and Isaac Bryan (D-Ladera Heights).
Yvonne Wheeler, president of the Los Angeles County Federation of Labor; April Verrett, President of Service Employees International Union (SEIU); Tia Orr, Executive Director of SEIU; and a host of others participated in the demonstration on the grounds of the state capitol.
“This is not a gig. This is your life. This is your job,” Bryan said at the rally. “When we organize and fight for our collective needs, it pulls from the people who have so much that they don’t know what to do with it and puts it in the hands of people who are struggling every single day.”
Existing law, the “Protect App-Based Drivers and Services Act,” created by Proposition (Prop) 22, a ballot initiative, categorizes app-based drivers for companies such as Uber and Lyft as independent contractors.
Prop 22 was approved by voters in the November 2020 statewide general election. Since then, Prop 22 has been in court facing challenges from groups trying to overturn it.
However, Prop 22 was upheld by the California Supreme Court last July.
In a 2024 statement after the ruling, Lyft said that 80% of the rideshare drivers it surveyed acknowledged that Prop 22 “was good for them” and that “median hourly earnings of drivers on the Lyft platform in California were 22% higher in 2023 than in 2019.”
Wicks and Berman crafted AB 1340 to circumvent Prop 22.
“With AB 1340, we are putting power in the hands of hundreds of thousands of workers to raise the bar in their industry and create a model for an equitable and innovative partnership in the tech sector,” Berman said.
Activism
Newsom Fights Back as AmeriCorps Shutdown Threatens Vital Services in Black Communities
By Bo Tefu
California Black Media
Gov. Gavin Newsom is suing the federal government over its decision to dismantle AmeriCorps, a move that puts essential frontline services in Black and Brown communities across California at risk, the Governor’s office said.
From tutoring students and mentoring foster youth to disaster recovery and community rebuilding, AmeriCorps has been a backbone of support for many communities across California.
“When wildfires devastated L.A. earlier this year, it was AmeriCorps members out there helping families recover,” Newsom said when he announced the lawsuit on April 17. “And now the federal government wants to pull the plug? We’re not having it.”
The Department of Government Efficiency (DOGE) under the Trump administration is behind the rollback, which Newsom calls “a middle finger to volunteers.”
Meanwhile, Newsom’s office announced that the state is expanding the California Service Corps, the nation’s largest state-run service program.
AmeriCorps has provided pathways for thousands of young people to gain job experience, give back, and uplift underserved neighborhoods. Last year alone, over 6,000 members across the state logged 4.4 million hours, tutoring more than 73,000 students, planting trees, supporting foster youth, and helping fire-impacted families.
The California Service Corps includes four paid branches: the #CaliforniansForAll College Corps, Youth Service Corps, California Climate Action Corps, and AmeriCorps California. Together, they’re larger than the Peace Corps and are working on everything from academic recovery to climate justice.
“DOGE’s actions aren’t about making government work better. They are about making communities weaker,” said GO-Serve Director Josh Fryday.
“These actions will dismantle vital lifelines in communities across California. AmeriCorps members are out in the field teaching children to read, supporting seniors and helping families recover after disasters. AmeriCorps is not bureaucracy; it’s boots on the ground,” he said.
Activism
AI Is Reshaping Black Healthcare: Promise, Peril, and the Push for Improved Results in California
Joe W. Bowers Jr.
California Black Media
Artificial intelligence (AI) is changing how Californians receive medical care – diagnosing diseases, predicting patient needs, streamlining treatments, and even generating medical notes for doctors.
While AI holds promise, it also poses risks, particularly for Black patients. It can provide faster diagnoses and expand access to care, but it may also misdiagnose conditions, delay treatment, or overlook patients’ critical needs. AI’s impact on Black patients depends on how biases in medical data and algorithms are addressed in its development.
“As we progress toward a society with increased use of AI technology, it is critical that the biases and stereotypes that Black Americans have faced are not perpetuated in our future innovations,” said Dr. Akilah Weber Pierson (D – San Diego), a physician and state senator spearheading legislative efforts to address AI bias in healthcare.
Why AI Matters for Black Californians
Black Californians experience some of the worst health outcomes in the state due to systemic inequities, limited healthcare access, and exclusion from medical research. 16.7% of Black adults report fair or poor health, versus 11.5% of Whites. Black adults have the highest death rates from prostate, breast, colorectal, and lung cancer. Statewide, diabetes affects 13.6% of Black adults versus 9.1% of Whites, and 27% of Black adults over 65 have heart disease, compared to 22% of Whites. Life expectancy for Black Californians is about five years shorter than the state average.
Benefits and Risks of AI in Healthcare
AI processes vast amounts of medical data using computer algorithms designed to identify patient health patterns, helping doctors diagnose diseases, recommend treatments, and increase the efficiency of patient care. By analyzing scans, lab results, and patient history, AI can detect diseases earlier, giving it the potential to improve care for Black patients, who face higher risks of prostate cancer, diabetes, heart disease and hypertension.
Dr. Judy Gichoya, an interventional radiologist at the Emory University Winship Cancer Institute and AI researcher at Emory’s Healthcare AI Innovation and Translational Informatics (HITI) Lab, sees AI as a tool with great potential but cautions that its effectiveness depends on the diversity of the data it is trained on. She says, “Without diverse datasets, AI could overlook critical signs of diseases, especially in underrepresented populations like Black patients.”
Dr. Timnit Gebru, a computer scientist and AI ethics expert, is the founder and Executive Director of DAIR (Distributed AI Research Institute) in Oakland. She has extensively studied bias in AI systems and their impact on marginalized groups.
Gebru acknowledges that AI has the potential to improve healthcare by enhancing efficiency and expanding access to medical resources. But, like Gichoya, she strongly stresses that for AI to be effective and equitable, it must be subject to rigorous oversight.
AI is already helping doctors personalize cancer treatment by identifying biomarkers and genetic mutations. UCSF and Stanford Health use AI to analyze tumor DNA to match patients with the most effective chemotherapy or immunotherapy.
In diabetes care, AI predicts blood sugar fluctuations, helping doctors adjust treatment. It helps radiologists in early disease detection and identifies sepsis sooner, reducing hospital deaths. In cardiology, AI detects early signs of heart disease, spotting plaque buildup or abnormal heart rhythms before symptoms appear. It also helps predict strokes by analyzing brain scans to determine risk and guide intervention.
Kaiser Permanente uses AI scribes to reduce paperwork and improve patient interactions. Covered California has partnered with Google Cloud to use AI to streamline document verification and eligibility decisions.
Despite these advancements, AI systems trained on biased medical data can perpetuate inequities for Black patients.
Gebru explains, “If AI learns from historically discriminatory medical decisions—such as undertreating Black patients—it will scale those biases.”
A notable example is in dermatology, where AI frequently misdiagnoses conditions in Black patients because most training datasets are based on lighter-skinned individuals. “Melanoma looks very different on darker skin,” Gebru notes. “It’s not just darker—it often appears differently, like under toenails, a pattern AI trained mostly on lighter skin won’t detect.”
Another risk of AI in healthcare is automation bias, where healthcare providers over-rely on AI, even when it contradicts medical expertise. “Doctors who would have prescribed medications accurately without AI sometimes make mistakes while using automated tools because they over-trust these systems,” Gebru adds.
AI-driven health insurance claim denials are a growing concern. UnitedHealthcare faces a class-action lawsuit for allegedly using an unregulated AI algorithm to deny rehabilitation coverage to elderly and disabled patients.
Beyond bias, AI also poses an environmental threat. AI systems require enormous amounts of energy for computing and massive amounts of water to cool data centers, which exacerbates climate change, an issue that already disproportionately impacts Black communities.
Trump Administration and DEI Impact
The Trump administration’s efforts to dismantle Diversity, Equity, and Inclusion (DEI) programs threaten funding for AI bias research in healthcare.
Less federal support could stall progress in making AI systems fairer and more accurate, increasing discrimination risks for Black patients.
California’s Legislative and Regulatory Response
Recognizing AI’s risks in healthcare, California lawmakers and state officials are implementing regulations. Weber Pierson introduced Senate Bill (SB) 503 to ensure that AI algorithms used in healthcare are tested for racial bias before implementation.
“We’ve already seen how biased medical devices like pulse oximeters can fail Black patients,” Weber Pierson explains. “If algorithms used in patient care aren’t inclusive, they’re not going to accurately serve melanated individuals.”
At a press conference introducing SB 503, Weber Pierson stressed that AI must be held accountable. “This bill focuses on ensuring that software used as an accessory to healthcare staff delivers sound, nondiscriminatory decisions that promote equitable outcomes.”
Other legislative efforts include Senate Bill (SB) 1120, by Sen. Josh Becker (D-Menlo Park), which stops insurance companies from using AI alone to deny or delay care, and Assembly Bill (AB) 3030, by Assemblymember Lisa Calderon (D-Whittier), which requires healthcare providers to inform patients when AI is used in their care.
Attorney General Rob Bonta has issued a legal advisory barring AI from unfairly denying healthcare claims, falsifying records, or restricting access to care based on medical history. Gov. Gavin Newsom’s 2023 executive order directs state agencies to assess AI’s impact and establish consumer protections, particularly in healthcare.
Actions Black Patients and Families Can Take
As AI becomes more common in healthcare, Black Californians can ensure fair treatment by asking if AI is used, seeking second opinions, and supporting groups addressing algorithmic bias.
They can:
- Ask their healthcare providers whether AI played a role in their diagnosis or treatment.
- Request second opinions if an AI-generated diagnosis seems questionable.
- Advocate for AI policies and legislation promoting fairness and accountability.
- Engage with community health organizations like the California Black Health Network (CBHN), which works to ensure AI is developed in ways that improve health outcomes for Black patients.
Rhonda Smith, CBHN’s executive director, says bias in medical algorithms must be eliminated. “There should never be any race-based adjustment in delivering patient care,” she said.
CBHN supports inclusive research and legislation like SB 503 to ensure AI promotes equity.
Ensuring AI Benefits All Communities
As a legislator, Weber Pierson is pushing for stronger safeguards to ensure AI serves all patients equitably. She says, “Innovation and technology are good, but new challenges arise if we don’t move in a direction inclusive and thoughtful of all people who utilize the healthcare space.”
AI has the potential to revolutionize healthcare, but experts warn it must be developed and regulated with transparency, accountability, and fairness – ensuring it reduces, rather than worsens, racial health disparities.