Business
Black Women in Tech Share Concerns, Hopes About Artificial Intelligence Industry
A.I. floodgates opened into the mainstream of human consumption late last year with the release of the generative A.I. ChatGPT, which uses natural language processing to create humanlike conversational dialogue for public use. A.I.’s popularity has spearheaded discussions on how chatbots and other A.I. applications like facial recognition and A.I. voice generators will impact the workforce, educational systems, entertainment, and individuals’ daily lives.

By McKenzie Jackson
California Black Media
Sofia Mbega’s first exposure to technology — more specifically, Artificial Intelligence (A.I.) — happened years before she moved from East Africa to the Golden State.
Mbega was a student at the University of Dodoma in Tanzania, when her mother, Gloria Mawaliza, suggested she take a technology course after learning about computer science from co-workers at the international children’s nonprofit World Vision.
Mbega, a Stockton resident since 2018, said taking courses in software engineering and receiving a degree in 2015 was previously unheard of in Tanzania.
“We were the first batch of students,” Mbega said of herself and her classmates. “It was a new profession for my country.”
When she learned about A.I. systems, a topic that continues to grab headlines across the U.S. with experts and pundits wrestling with its merits and dangers, Mbega was intrigued.
“I was so excited,” she recalled. “But I did not picture things would be like this. I thought A.I. would only be something to help software engineers.”
The technology has moved well beyond that purpose.
A.I. floodgates opened into the mainstream of human consumption late last year with the release of the generative A.I. ChatGPT, which uses natural language processing to create humanlike conversational dialogue for public use.
A.I.’s popularity has spearheaded discussions on how chatbots and other A.I. applications like facial recognition and A.I. voice generators will impact the workforce, educational systems, entertainment, and individuals’ daily lives.
Despite only accounting for a small percentage of the technology sector workforce, Black women like Mbega, a 31-year-old independent data analysis contractor, are constantly assessing the positives and negatives of A.I. and what it is like to work in the industry.
Mbega, a member of Black Women in A.I., a 3-year-old organization that aims to educate and empower Black women, says she is still excited about A.I., but alarm bells are ringing.
If you ask large language model-based chatbots like ChatGPT a question, they will answer. People have used A.I. to draft emails, compose music, write computer code, and create videos and images.
Mbega worries that bad actors could use A.I. for nefarious reasons.
“Someone can make a video of someone saying a crazy or bad thing and people will believe it,” she said.
Oakland resident Joy Dixon, a software engineering manager at Hazel Health and the founder of Mosaic Presence Inc., is concerned about students becoming too dependent on A.I. for educational tasks such as writing papers and solving problems.
“How much is it really advancing them?” Dixon asked. “Is it doing us a disservice that we won’t see now, but maybe in five to 10 years?”
Her main concern with A.I. though is prejudices present in the technology.
“A.I. is built on models of people, and people have their own biases and challenges,” Dixon said. “Computers aren’t neutral.”
There are documented instances of A.I. image generators producing distorted or stereotypical images of Black people when directed to create an image of a “Black” or “African American” person. The technology has created images depicting Black people with lighter skin tones or non-Black hair.
In July, Bloomberg analyzed more than 5,000 images generated by Stability AI’s Stable Diffusion and revealed that the text-to-image model amplified stereotypes about race and gender. It portrayed individuals with lighter skin tones as having high-paying jobs and people with darker skin tones having occupations such as dishwashers, janitors and housekeepers.
Google disabled its Photos app’s ability to let people search for monkeys and gorillas eight years ago because the algorithm was incorrectly placing Black people in those categories.
A.I. developers have said they are addressing the issue of biases, but Dixon, 53, who has worked in tech since 1997, believes the problem will persist unless more people of color participate in constructing the systems A.I. technology is built upon.
“When car airbags were first released, they killed more women than saved women because nobody tested them on crash dummies that were the size of women,” she said. “There is similar concern about A.I. If you are only building models with a certain subset of the demographic, then you are leaving whole groups out.”
Gov. Gavin Newsom signed an executive order on Sept. 6 to examine the use, development, and risks of A.I. in the state and to shape a process for deployment and evaluation of the technology.
Newsom called A.I. “transformative technology” and noted that the government sees the good and bad of A.I.
“We’re taking a clear-eyed, humble approach to this world-changing technology,” he said.
Dr. Brandeis Marshall, a data scientist and professor at Atlanta’s Spelman College, said Black women in technology have skills equal to or better than their counterparts, so more should be involved in the construction of A.I. systems. However, they do not get the same opportunities.
“I meet plenty of Black women who have all the chops, but they haven’t been promoted,” she said. “You tend to be the only one in the room.”
Black Women in A.I. founder Angle Bush of Houston said Black women have much to contribute to A.I.
“We have had to be innovative,” she said. “If we don’t have something, we figure out a way to create it. There are a lot of ideas that haven’t come to fruition because of lack of access and opportunity. It has nothing to do with our aptitude.”
Mbega believes the technology can be groundbreaking in health care and help identify ailments such as brain cancer.
Marshall said any discussions of A.I. systems taking over the world like in a Hollywood blockbuster are overblown.
“Right now, we get inundated with all the cool things,” she said. “Then, we seem surprised that there are harmful things. Let’s get a 360-degree view before we put all of our chips in one basket.”
Activism
California Rideshare Drivers and Supporters Step Up Push to Unionize
Today in California, over 600,000 rideshare drivers want the ability to form or join unions for the sole purpose of collective bargaining or other mutual aid and protection. It’s a right, and recently at the State Capitol, a large number of people, including some rideshare drivers and others working in the gig economy, reaffirmed that they want to exercise it.

By Antonio Ray Harvey
California Black Media
On July 5, 1935, President Franklin D. Roosevelt signed into federal law the National Labor Relations Act (NLRA). Also known as the “Wagner Act,” the law paved the way for employees to have “the right to self-organization, to form, join, or assist labor organizations,” and “to bargain collectively through representatives of their own choosing,” according to the legislation’s language.
Today in California, over 600,000 rideshare drivers want the ability to form or join unions for the sole purpose of collective bargaining or other mutual aid and protection. It’s a right, and recently at the State Capitol, a large number of people, including some rideshare drivers and others working in the gig economy, reaffirmed that they want to exercise it.
On April 8, the rideshare drivers held a rally with lawmakers to garner support for Assembly Bill (AB) 1340, the “Transportation Network Company Drivers (TNC) Labor Relations Act.”
Authored by Assemblymembers Buffy Wicks (D-Oakland) and Marc Berman (D-Menlo Park), AB 1340 would allow drivers to create a union and negotiate contracts with industry leaders like Uber and Lyft.
“All work has dignity, and every worker deserves a voice — especially in these uncertain times,” Wicks said at the rally. “AB 1340 empowers drivers with the choice to join a union and negotiate for better wages, benefits, and protections. When workers stand together, they are one of the most powerful forces for justice in California.”
Wicks and Berman were joined by three members of the California Legislative Black Caucus (CLBC): Assemblymembers Tina McKinnor (D-Inglewood), Sade Elhawary (D-Los Angeles), and Isaac Bryan (D-Ladera Heights).
Yvonne Wheeler, president of the Los Angeles County Federation of Labor; April Verrett, president of the Service Employees International Union (SEIU); Tia Orr, executive director of SEIU; and a host of others participated in the demonstration on the grounds of the State Capitol.
“This is not a gig. This is your life. This is your job,” Bryan said at the rally. “When we organize and fight for our collective needs, it pulls from the people who have so much that they don’t know what to do with it and puts it in the hands of people who are struggling every single day.”
Existing law, the “Protect App-Based Drivers and Services Act,” created by Proposition (Prop) 22, a ballot initiative, categorizes app-based drivers for companies such as Uber and Lyft as independent contractors.
Prop 22 was approved by voters in the November 2020 statewide general election. Since then, Prop 22 has been in court facing challenges from groups trying to overturn it.
However, Prop 22 was upheld by the California Supreme Court last July.
In a 2024 statement after the ruling, Lyft stated that 80% of the rideshare drivers it surveyed acknowledged that Prop 22 “was good for them” and that “median hourly earnings of drivers on the Lyft platform in California were 22% higher in 2023 than in 2019.”
Wicks and Berman crafted AB 1340 to circumvent Prop 22.
“With AB 1340, we are putting power in the hands of hundreds of thousands of workers to raise the bar in their industry and create a model for an equitable and innovative partnership in the tech sector,” Berman said.
Activism
Newsom Fights Back as AmeriCorps Shutdown Threatens Vital Services in Black Communities
“When wildfires devastated L.A. earlier this year, it was AmeriCorps members out there helping families recover,” Gov. Newsom said when he announced the lawsuit on April 17. “And now the federal government wants to pull the plug? We’re not having it.”

By Bo Tefu
California Black Media
Gov. Gavin Newsom is suing the federal government over its decision to dismantle AmeriCorps, a move that puts essential frontline services in Black and Brown communities across California at risk, the Governor’s office said.
From tutoring students and mentoring foster youth to disaster recovery and community rebuilding, AmeriCorps has been a backbone of support for many communities across California.
“When wildfires devastated L.A. earlier this year, it was AmeriCorps members out there helping families recover,” Newsom said when he announced the lawsuit on April 17. “And now the federal government wants to pull the plug? We’re not having it.”
The Department of Government Efficiency (DOGE) under the Trump administration is behind the rollback, which Newsom calls “a middle finger to volunteers.”
Meanwhile, Newsom’s office announced that the state is expanding the California Service Corps, the nation’s largest state-run service program.
AmeriCorps has provided pathways for thousands of young people to gain job experience, give back, and uplift underserved neighborhoods. Last year alone, over 6,000 members across the state logged 4.4 million hours, tutoring more than 73,000 students, planting trees, supporting foster youth, and helping fire-impacted families.
The California Service Corps includes four paid branches: the #CaliforniansForAll College Corps, Youth Service Corps, California Climate Action Corps, and AmeriCorps California. Together, they’re larger than the Peace Corps and are working on everything from academic recovery to climate justice.
“DOGE’s actions aren’t about making government work better. They are about making communities weaker,” said GO-Serve Director Josh Fryday.
“These actions will dismantle vital lifelines in communities across California. AmeriCorps members are out in the field teaching children to read, supporting seniors and helping families recover after disasters. AmeriCorps is not bureaucracy; it’s boots on the ground,” he said.
Activism
AI Is Reshaping Black Healthcare: Promise, Peril, and the Push for Improved Results in California
Black Californians experience some of the worst health outcomes in the state due to systemic inequities, limited healthcare access, and exclusion from medical research. 16.7% of Black adults report fair or poor health, versus 11.5% of Whites. Black adults have the highest death rates from prostate, breast, colorectal, and lung cancer. Statewide, diabetes affects 13.6% of Black adults versus 9.1% of Whites, and 27% of Black adults over 65 have heart disease, compared to 22% of Whites. Life expectancy for Black Californians is about five years shorter than the state average.

Joe W. Bowers Jr.
California Black Media
Artificial intelligence (AI) is changing how Californians receive medical care – diagnosing diseases, predicting patient needs, streamlining treatments, and even generating medical notes for doctors.
While AI holds promise, it also poses risks, particularly for Black patients. It can provide faster diagnoses and expand access to care, but it may also misdiagnose conditions, delay treatment, or overlook patients’ critical needs. AI’s impact on Black patients depends on how biases in medical data and algorithms are addressed in its development.
“As we progress toward a society with increased use of AI technology, it is critical that the biases and stereotypes that Black Americans have faced are not perpetuated in our future innovations,” said Dr. Akilah Weber Pierson (D-San Diego), a physician and state senator spearheading legislative efforts to address AI bias in healthcare.
Why AI Matters for Black Californians
Black Californians experience some of the worst health outcomes in the state due to systemic inequities, limited healthcare access, and exclusion from medical research. 16.7% of Black adults report fair or poor health, versus 11.5% of Whites. Black adults have the highest death rates from prostate, breast, colorectal, and lung cancer. Statewide, diabetes affects 13.6% of Black adults versus 9.1% of Whites, and 27% of Black adults over 65 have heart disease, compared to 22% of Whites. Life expectancy for Black Californians is about five years shorter than the state average.
Benefits and Risks of AI in Healthcare
AI processes vast amounts of medical data using computer algorithms designed to identify patient health patterns, helping doctors to diagnose diseases, recommend treatments, and increase patient care efficiency. By analyzing scans, lab results, and patient history, AI can detect diseases earlier, giving it the potential to improve care for Black patients, who face higher risks of prostate cancer, diabetes, heart disease, and hypertension.
Dr. Judy Gichoya, an interventional radiologist at the Emory University Winship Cancer Institute and an AI researcher at Emory’s Healthcare AI Innovation and Translational Informatics (HITI) Lab, sees AI as a tool with great potential but cautions that its effectiveness depends on the diversity of the data it is trained on. She says, “Without diverse datasets, AI could overlook critical signs of diseases, especially in underrepresented populations like Black patients.”
Dr. Timnit Gebru, a computer scientist and AI ethics expert, is the founder and Executive Director of DAIR (Distributed AI Research Institute) in Oakland. She has extensively studied bias in AI systems and their impact on marginalized groups.
Gebru acknowledges that AI has the potential to improve healthcare by enhancing efficiency and expanding access to medical resources. But, like Gichoya, she stresses that for AI to be effective and equitable, it must be subject to rigorous oversight.
AI is already helping doctors personalize cancer treatment by identifying biomarkers and genetic mutations. UCSF and Stanford Health use AI to analyze tumor DNA to match patients with the most effective chemotherapy or immunotherapy.
In diabetes care, AI predicts blood sugar fluctuations, helping doctors adjust treatment. It helps radiologists in early disease detection and identifies sepsis sooner, reducing hospital deaths. In cardiology, AI detects early signs of heart disease, spotting plaque buildup or abnormal heart rhythms before symptoms appear. It also helps predict strokes by analyzing brain scans to determine risk and guide intervention.
Kaiser Permanente uses AI scribes to reduce paperwork and improve patient interactions. Covered California has partnered with Google Cloud to use AI to streamline document verification and eligibility decisions.
Despite these advancements, AI systems trained on biased medical data can perpetuate inequities for Black patients.
Gebru explains, “If AI learns from historically discriminatory medical decisions—such as undertreating Black patients—it will scale those biases.”
A notable example is in dermatology, where AI frequently misdiagnoses conditions in Black patients because most training datasets are based on lighter-skinned individuals. “Melanoma looks very different on darker skin,” Gebru notes. “It’s not just darker—it often appears differently, like under toenails, a pattern AI trained mostly on lighter skin won’t detect.”
Another risk of AI in healthcare is automation bias, where healthcare providers over-rely on AI, even when it contradicts medical expertise. “Doctors who would have prescribed medications accurately without AI sometimes make mistakes while using automated tools because they over-trust these systems,” Gebru adds.
AI-driven health insurance claim denials are a growing concern. UnitedHealthcare faces a class-action lawsuit for allegedly using an unregulated AI algorithm to deny rehabilitation coverage to elderly and disabled patients.
Beyond bias, AI also poses an environmental threat. AI systems require enormous amounts of energy for computing and massive amounts of water to cool data centers, which exacerbates climate change, an issue that already disproportionately impacts Black communities.
Trump Administration and DEI Impact
The Trump administration’s efforts to dismantle Diversity, Equity, and Inclusion (DEI) programs threaten funding for AI bias research in healthcare.
Less federal support could stall progress in making AI systems fairer and more accurate, increasing discrimination risks for Black patients.
California’s Legislative and Regulatory Response
Recognizing AI’s risks in healthcare, California lawmakers and state officials are implementing regulations. Weber Pierson introduced Senate Bill (SB) 503 to ensure that AI algorithms used in healthcare are tested for racial bias before implementation.
“We’ve already seen how biased medical devices like pulse oximeters can fail Black patients,” Weber Pierson explains. “If algorithms used in patient care aren’t inclusive, they’re not going to accurately serve melanated individuals.”
At a press conference introducing SB 503, Weber Pierson stressed that AI must be held accountable. “This bill focuses on ensuring that software used as an accessory to healthcare staff delivers sound, nondiscriminatory decisions that promote equitable outcomes.”
Other legislative efforts include Senate Bill (SB) 1120, by Sen. Josh Becker (D-Menlo Park), which stops insurance companies from using AI alone to deny or delay care, and Assembly Bill (AB) 3030, by Assemblymember Lisa Calderon (D-Whittier), which requires healthcare providers to inform patients when AI is used in their care.
Attorney General Rob Bonta has issued a legal advisory barring AI from unfairly denying healthcare claims, falsifying records, or restricting access to care based on medical history. Gov. Gavin Newsom’s 2023 executive order directs state agencies to assess AI’s impact and establish consumer protections, particularly in healthcare.
Actions Black Patients and Families Can Take
As AI becomes more common in healthcare, Black Californians can ensure fair treatment by asking if AI is used, seeking second opinions, and supporting groups addressing algorithmic bias.
They can:
- Ask their healthcare providers whether AI played a role in their diagnosis or treatment.
- Request second opinions if an AI-generated diagnosis seems questionable.
- Advocate for AI policies and legislation promoting fairness and accountability.
- Engage with community health organizations like the California Black Health Network (CBHN), which works to ensure AI is developed in ways that improve health outcomes for Black patients.
Rhonda Smith, CBHN’s executive director, says bias in medical algorithms must be eliminated. “There should never be any race-based adjustment in delivering patient care,” she said.
CBHN supports inclusive research and legislation like SB 503 to ensure AI promotes equity.
Ensuring AI Benefits All Communities
As a legislator, Weber Pierson is pushing for stronger safeguards to ensure AI serves all patients equitably. She says, “Innovation and technology are good, but new challenges arise if we don’t move in a direction inclusive and thoughtful of all people who utilize the healthcare space.”
AI has the potential to revolutionize healthcare, but experts warn it must be developed and regulated with transparency, accountability, and fairness – ensuring it reduces, rather than worsens, racial health disparities.