Why shielding businesses from coronavirus liability is a bad idea

Timothy D. Lytton, Distinguished University Professor & Professor of Law, Georgia State University. He is a member of the American Association for Justice.

Georgia State University provides funding as a founding partner of The Conversation US.
Congress may be close to a deal on another coronavirus bailout, but Senate Republican demands for liability protections for businesses remain a major obstacle.
Senate Majority Leader Mitch McConnell has long warned of an “avalanche” of lawsuits that will stymie economic recovery efforts if Congress does not grant companies sweeping immunity from civil liability for failure to adequately protect workers and customers from infection.
My research on the role of civil lawsuits in reducing foodborne illness outbreaks suggests that fears of excessive litigation are unwarranted. What’s more, the modest liability exposure that does exist is important to ensuring businesses take reasonable coronavirus precautions as they resume normal operations.
As a general matter, businesses are subject to civil liability for carelessness that causes injury to others. The law defines carelessness as a failure to exercise “reasonable care.”
In applying this standard, courts consider several factors: Did the business take cost-effective precautions? Did it follow applicable health and safety regulations? Did it keep pace with the practices of similar businesses?
If the answer to one or more of these questions is no, then a court may conclude that the business was careless and is subject to liability for damages to customers who suffered harm.
In the context of the current pandemic, I believe that reasonable care sets a clear standard for business owners. Invest in cost-effective precautions like ensuring employees wear masks and provide for social distancing. Follow the latest guidance of health officials and all health and safety regulations. Keep up with what other similar businesses are doing to prevent infection. Use common sense.
Law-abiding, thoughtful business owners – those who care about the safety of their employees and their patrons – are likely to exercise reasonable care to prevent COVID-19 transmission with or without the threat of a lawsuit.
For example, the owner of a nail salon in Georgia described her reopening plans back in April: the salon would accept patrons by appointment only, conduct pre-screening telephone interviews for signs of illness, limit the number of people in the salon at any one time, take temperatures before allowing anyone to enter, require hand-washing, equip employees and patrons with masks and gloves, and sanitize all work areas between appointments.
Conscientious business owners like this have no reason to fear a lawsuit alleging they failed to take reasonable precautions.
Predictions of “frivolous” lawsuits appear to be generating unnecessary anxiety among business groups. But those fears are misplaced. Personal injury lawyers representing victims work on a contingency-fee basis, meaning they earn fees only when they bring cases with a strong enough chance of winning to reach a favorable settlement or judgment.
Lawyers have no incentive to bring sure losers, and they risk being disciplined for professional misconduct if they do so. For these reasons, frivolous lawsuits are rare and highly unlikely in the context of COVID-19 transmission claims against businesses.
The best available data does not support dire warnings about excessive litigation. As of Dec. 7, 6,571 civil lawsuits related to COVID-19 had been filed. Only 37 of these are personal injury claims by business patrons for COVID-19 exposure, and an additional 116 are claims by employees against companies for inadequate protection from infection in the workplace, personal injury or wrongful death.
Most of the claims involved other issues, such as 1,372 insurance disputes over business losses and 1,184 claims for alleged civil rights violations.
If there is any reason to fear excessive litigation, these numbers suggest that the real threat is from lawsuits filed by business owners against their insurance companies and individuals protesting public health measures designed to prevent another economic shutdown – not from personal injury claims.
Even for business owners who fail to take reasonable precautions, the prospect of a personal injury claim is still remote.
To successfully sue a business for COVID-19 transmission, a patron would have to prove that he or she contracted COVID-19 from the business and not from some other source. However, most people infected with COVID-19 currently have no reliable way of identifying the source of their infection. The gap of three to 11 days between infection and illness, the difficulty of recalling all of one’s contacts during that interval and limited testing for the virus present formidable obstacles to establishing causation.
Moreover, a business would not be liable to patrons who knowingly and voluntarily assumed the risk of infection. Patrons of crowded stores or businesses where many customers and employees are not wearing masks, for example, would not have viable legal claims even if they can prove carelessness and causation.
As for claims by employees against careless businesses, most of these will be covered by workers’ compensation, which precludes employees from filing negligence claims for workplace injuries.
Because of these considerable challenges, viable legal claims related to COVID-19 are likely to be extremely rare.
Yet even a small number of personal injury lawsuits act as a nudge, encouraging the entire business community to adopt reasonable precautions. This is one of the lessons of civil litigation arising out of foodborne illness outbreaks.
As I document in my 2019 book, “Outbreak: Foodborne Illness and the Struggle for Food Safety,” a handful of high-profile lawsuits against food companies have encouraged businesses at every link along the supply chain to improve their safety practices. That’s what happened after lawsuits against Jack in the Box over contaminated hamburgers in 1993 and Dole over E. coli in baby spinach in 2006.
Similarly, the prospect of liability for COVID-19 transmission is likely to encourage business owners to invest in cost-effective precautions, follow the advice of public health authorities, adopt industry safety standards and use common sense.
I believe shielding business owners from this liability is one kind of immunity that will not help end the current crisis.
This is an updated version of an article most recently published on Sept. 8, 2020.
Copyright © 2010–2021, The Conversation US, Inc.



Yes, customers do like it when waiters and hairdressers wear a mask – especially if it’s black


Professor of Hospitality and Tourism, University of South Florida
Assistant Professor of Hospitality and Tourism, University of South Florida
Financial Analyst and Researcher, University of South Florida
Ph.D. Student in Hospitality Management, Auburn University
The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

University of South Florida provides funding as a founding partner of The Conversation US.
The Research Brief is a short take about interesting academic work.
Customers perceive a better quality of service, feel less anxious and exhibit more trust in businesses when waiters and other service workers wear a mask, according to a new study we just submitted for peer review. And we found this to be especially true when the mask was black.
We surveyed about 4,500 Americans through Amazon’s Mechanical Turk, showing each of them a random picture of a service employee, with or without a mask, in either a grocery store, bank, hair salon, hotel or restaurant. We included photos of men and women who were either Black, white or Asian. Further, the masks were in one of five color schemes: white, black, blue, multi-colored or clear.
We then asked participants to record their impressions of the service workers and subsequent perceptions, emotions and behavior.
We found that customers consistently expected higher quality of service from workers who wore masks compared to employees who weren’t wearing face coverings. We also found that participants tended to become less anxious when they saw a service person with a mask.
Interestingly, we found that the color of the mask made a difference. Workers wearing black masks received the highest ratings, followed by white, multicolored and blue. The clear mask – even though it allows customers to actually see facial expressions – was rated lowest by respondents.
While we didn’t ask participants about their political leanings, we did learn where they reside. And we found that those based in the West had the most positive reaction to mask-wearing, followed, in order, by people in the Southwest, Northeast, Southeast and Midwest.
We found no meaningful differences in terms of the respondents’ age, race or educational level.
Among President Joe Biden’s first acts in office was requiring that masks be worn on all federal property and on planes and trains, and most businesses already expect their employees to wear face coverings while working. While the primary reason for this is to mitigate the spread of the coronavirus and protect workers, little is known about how mask-wearing can affect customer perceptions of service quality.
Wearing a mask in Asian cultures has been a socially accepted practice for years. In the United States, however, wearing a mask became common only last year as the pandemic worsened in the spring. The practice remains controversial among some people who claim it violates their civil liberties or isn’t actually effective, though health officials have consistently endorsed their use.
Although there have been many reports of altercations when workers asked a customer to wear a mask, our research shows most people appreciate it when waiters and hairdressers cover their own faces.
We plan to also study other effects of wearing masks. For example, are the perceptions of service staff members affected when customers and fellow employees wear a mask or not? What’s the impact on a customer if a mask carries a logo of a company or branding message? Do servers who wear masks receive higher tips?



Harriet Tubman: Biden revives plan to put a Black woman of faith on the $20 bill


Professor and Chair of History Department, Colorado State University
Robert Gudmestad does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Colorado State University provides funding as a member of The Conversation US.
The Biden administration has revived a plan to put Harriet Tubman on the US$20 bill after Donald Trump’s Treasury secretary delayed the move.
That’s encouraging news to the millions of people who have expressed support for putting her face on the bill. But many still aren’t familiar with the story of Tubman’s life, which was chronicled in a 2019 film, “Harriet.”
Harriet Tubman was an enslaved laborer, a Union spy and eventually an abolitionist. What I find most fascinating, as a historian of American slavery, is how Tubman’s belief in God helped her remain fearless, even when she came face to face with many challenges.
Tubman was born Araminta Ross in 1822 on the Eastern Shore of Maryland. When interviewed later in life, Tubman said she started working as a housemaid when she was 5. She recalled that she endured whippings, starvation and hard work even before she got to her teenage years.
She labored in Maryland’s tobacco fields, but things started to change when farmers switched their main crop to wheat.
Grain required less labor, so slave owners began to sell their enslaved people to plantation owners in the Deep South.
Two of Tubman’s sisters were sold to a slave trader. One had to leave her child behind. Tubman, too, lived in fear of being sold.
When she was 22, Tubman married a free Black man named John Tubman. For reasons that are unclear, she changed her name, taking her mother’s first name and her husband’s last name. Her marriage did not change her status as an enslaved person.
Five years later, rumors circulated in the slave community that slave traders were once again prowling through the Eastern Shore. Tubman decided to seize her freedom rather than face the terror of being chained with other slaves to be carried away, often referred to as the “chain gang.”
Tubman stole into the woods and, with the help of some members of the Underground Railroad, walked the 90 miles to Philadelphia, where slavery was illegal. The Underground Railroad was a loose network of African Americans and whites who helped fugitive slaves escape to a free state or to Canada. Tubman began working with William Still, an African American clerk from Philadelphia, who helped slaves find freedom.
Tubman led about a dozen rescue missions that freed about 60 to 80 people. She normally rescued people in the winter, when the long dark nights provided cover, and she often adopted some type of disguise. Even though she was the only “conductor” on rescue missions, she depended on a few houses connected with the Underground Railroad for shelter. She never lost a person escaping with her and won the nickname of Moses for leading so many people to “the promised land,” or freedom.
After the Civil War began, Tubman volunteered to serve as a spy and scout for the Union Army. She ended up in South Carolina, where she helped lead a military mission up the Combahee River. Located about halfway between Savannah, Georgia, and Charleston, South Carolina, the river was lined with a number of valuable plantations that the Union Army wanted to destroy.
Tubman helped guide three Union steamboats around Confederate mines and then helped about 750 enslaved people escape with the federal troops.
She was the only woman to lead men into combat during the Civil War. After the war, she moved to New York and was active in campaigning for equal rights for women. She died in 1913 at the age of 90.
Tubman’s Christian faith tied all of these remarkable achievements together.
She grew up during the Second Great Awakening, which was a Protestant religious revival in the United States. Preachers took the gospel of evangelical Christianity from place to place, and church membership flourished. Christians at this time believed that they needed to reform America to usher in Christ’s second coming.
A number of Black female preachers preached the message of revival and sanctification on Maryland’s Eastern Shore. Jarena Lee was the first authorized female preacher in the African Methodist Episcopal Church.
It is not clear whether Tubman attended any of Lee’s camp meetings, but she was inspired by the evangelist. She came to understand that women could hold religious authority.
Historian Kate Clifford Larson believes that Tubman drew from a variety of Christian traditions, including African Methodist Episcopal, Baptist and Catholic beliefs. Like many enslaved people, her belief system fused Christian and African beliefs.
Her belief that there was no separation between the physical and spiritual worlds was a direct result of African religious practices. Tubman literally believed that she moved between a physical existence and a spiritual experience where she sometimes flew over the land.
An enslaved person who trusted Tubman to help him escape simply noted that Tubman had “de charm,” or God’s protection. Charms or amulets were strongly associated with African religious beliefs.
A horrific accident is believed to have brought Tubman closer to God and reinforced her Christian worldview. Sarah Bradford, a 19th-century writer who conducted interviews with Tubman and several of her associates, found the deep role faith played in her life.
When she was a teenager, Tubman happened to be at a dry goods store when an overseer was trying to capture an enslaved person who had left his slave labor camp without permission. The angry man threw a 2-pound weight at the runaway but hit Tubman instead, crushing part of her skull. For two days she lingered between life and death.
The injury almost certainly gave her temporal lobe epilepsy. As a result, she would have splitting headaches, fall asleep without notice, even during conversations, and have dreamlike trances.
As Bradford documents, Tubman believed that her trances and visions were God’s revelation and evidence of his direct involvement in her life. One abolitionist told Bradford that Tubman “talked with God, and he talked with her every day of her life.”
According to Larson, this confidence in providential guidance and protection helped make Tubman fearless. Standing only 5 feet tall, she had an air of authority that demanded respect.
Once Tubman told Bradford that when she was leading two “stout” men to freedom, she believed that “God told her to stop” and leave the road. She led the scared and reluctant men through an icy stream – and to freedom.
Harriet Tubman once said that slavery was “the next thing to hell.” She helped many transcend that hell.
This is an updated version of an article originally published on Dec. 3, 2019.



Trump revived Andrew Jackson’s spoils system, which would undo America’s 138-year-old professional civil service


Professor of Business Administration and of Public and International Affairs, University of Pittsburgh
Barry M. Mitnick does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

University of Pittsburgh provides funding as a member of The Conversation US.
The federal government’s core civilian workforce has long been known for its professionalism. About 2.1 million nonpartisan career officials provide essential public services in such diverse areas as agriculture, national parks, defense, homeland security, environmental protection and veterans affairs.
To get the vast majority of these “competitive service” jobs – which are protected from easy firing – federal employees must demonstrate achievement in job-specific knowledge, skills and abilities superior to other applicants and, in some cases, pass an exam. In other words, the civil service is designed to be “merit-based.”
It wasn’t always so.
From Andrew Jackson until Theodore Roosevelt, much of the federal workforce could be replaced after every presidential election – and often was. Known as the spoils system, this pattern of political patronage, in which officeholders reward allies with jobs in return for support, began to end in the late 19th century as citizens and politicians like Roosevelt grew fed up with its corruption, incompetence and inefficiency – and its role in the assassination of a president.
Less than two weeks before Election Day, Donald Trump signed an executive order that threatens to return the U.S. to a spoils system in which a large share of the federal government’s workforce could be fired for little or no reason – including a perceived lack of loyalty to the president.
While President Joe Biden appears likely to reverse the order, its effects may not be so easily undone. And he may have his own reasons for keeping it temporarily in place.
The government of the early republic was small, but the issue of whether civil servants should be chosen on the basis of patronage or skills was hotly debated.
Although George Washington and the five presidents who followed him certainly employed patronage, they emphasized merit when making appointments.
Washington wrote that relying on one’s personal relationship to the applicant would constitute “an absolute bar to preferment” and wanted those “as in my judgment shall be the best qualified to discharge the functions of the departments to which they shall be appointed.” He would not even appoint his own soldiers to government positions if they lacked the necessary qualifications.
That changed in 1829 when Andrew Jackson, the seventh president, entered the White House.
Jackson came to office as a reformer with a promise to end the dominance of elites and what he considered their corrupt policies. He believed that popular access to government jobs – and their frequent turnover through a four-year “rotation in office” – could serve ideals of democratic participation, regardless of one’s qualifications for a position.
As a result, a huge crowd of office seekers crashed his inaugural reception on March 4. Jackson was “besieged by applicants” and “battalions of hopefuls,” all seeking government jobs.
Instead of preventing corruption from taking root, Jackson’s rotation policy became an opportunity for patronage – or rewarding supporters with the spoils of victory. He defended the practice by declaring: “If my personal friends are qualified and patriotic, why should I not be permitted to bestow a few offices on them?”
Besides often lacking appropriate skills and commitment, office seekers were expected to pay “assessments” – a percentage of their salary ranging from 2% to 7% – to the party that appointed them.
Although Jackson replaced only about 10% of the federal workforce and 41% of presidential appointments, the practice increasingly became the norm as subsequent presidents fired, or refused to reappoint, ever-larger shares of the government.

The peak of the spoils system came under James Buchanan, who served from 1857 to 1861. He replaced virtually every federal worker at the end of their “rotation.” William L. Marcy, who was secretary of state under Buchanan’s predecessor and was the first to refer to patronage as “spoils,” wrote in 1857 that civil servants from his administration were being “hunted down like wild beasts.”
Even Abraham Lincoln, who followed Buchanan, made extensive use of the system, replacing at least 1,457 of the 1,639 officials then subject to presidential appointment. The number would have been higher but for the secession of Southern states, which put some federal officials out of his reach.
The tide began to turn in the late 1860s following public revelations that positions had been created requiring little or no work and other abuses, including illiterate appointees, and a congressional report about the success of civil service systems in Great Britain, China, France and Prussia.
In 1870, President Ulysses S. Grant asked Congress to take action, complaining, “The present system does not secure the best men, and often not even fit men, for public place.” Congress responded with legislation that authorized the president to use executive orders to prescribe regulations for the civil service. That power exists today, most recently exercised in Trump’s own order.
Grant established a Civil Service Commission that led to some reforms, but just two years later a hostile Congress cut off new funding, and Grant terminated the experiment in March 1875. The number of jobs potentially open to patronage continued to soar, doubling from 51,020 in 1871 to 100,020 in 1881.
But across the U.S., citizens were becoming disgusted by a government stuffed with the people known as “spoilsmen,” leading to a growing reform movement. The assassination of President James Garfield in 1881 by a deranged office seeker who felt Garfield had denied him the Paris diplomatic post he wanted pushed the movement over the edge.
Garfield’s murder was widely blamed on the spoils system. George William Curtis, editor of Harper’s Weekly and an advocate for reform, published cartoons lambasting the system and called it “a vast public evil.”
In early 1883, immediately after an election that led to sweeping gains for politicians in favor of reform, Congress passed the Pendleton Act. It created the Civil Service System of merit-based selection and promotion. The act banned “assessments,” implemented competitive exams and open competitions for jobs, and prevented civil servants from being fired for political reasons.
Roosevelt was appointed to the new commission that oversaw the system by President Benjamin Harrison in 1889 and quickly became its driving force – even as Harrison himself abused the spoils system, replacing 43,823 out of 58,623 postmasters, for example.
At first, the system covered just 10.5% of the federal workforce, but it was gradually expanded to cover most workers. Under Roosevelt, who became president in 1901 after William McKinley was assassinated, the number of covered employees finally exceeded those not covered in 1904 and soon reached almost two-thirds of all federal jobs. At its peak in the 1950s, the competitive civil service covered almost 90% of federal employees.
New York, where Roosevelt was an assemblyman, and Massachusetts were the first states to implement their own civil service systems. Although all states now have such systems in place at local, state or both levels, it was not until after 1940 that most states adopted a competitive civil service.
Trump’s executive order would mark a significant change.
The Oct. 21 order created a new category of the civil service workforce, known as “Schedule F,” which would include all currently protected employees in career positions that have a “confidential, policy-determining, policy-making or policy-advocating character.” Because the language is both vague and encompassing, it could apply to hundreds of thousands of the 2.1 million federal civilian workers – potentially to every worker with any discretion in giving advice or making decisions.
The first agency to report a list of covered workers, the Office of Management and Budget, identified 425 professionals – 88% of its employees – as transferable to Schedule F, which means they could be fired at will.
Although the order didn’t formally take effect until Jan. 19, some agencies had already taken actions consistent with it – including an apparent “purge” of career employees deemed insufficiently loyal to Trump. But the Trump administration was unable to fully implement Schedule F before Biden took over on Jan. 20.
Of course, Biden could quickly reverse the order – and there’s already a bipartisan push to forbid these transfers – but rehiring anyone who has been fired won’t be easy or immediate.
Furthermore, Trump had tried to “burrow” political appointees deep into the senior executive service, the top level of the civil service. The burrowing included the controversial appointment of Michael Ellis as general counsel of the National Security Agency. Senior executive service rules permit some political appointees to be converted to civil servants. This could protect them from easily being removed by Biden.
Biden may want to remove civil servants considered Trump loyalists who may try to subvert his policies. If so, he’ll have to keep the executive order in place to expedite the process and convert those employees to the new Schedule F classification, which would allow him to remove them. But keeping and using Schedule F, even for a relatively brief period, challenges the most fundamental principles of the civil service.
Trump’s order and Biden’s dilemma show that Teddy Roosevelt’s work is still unfinished. If, on a whim, a president can undo over a century of reforms, then the civil service remains insufficiently insulated from politics and patronage. It may be time Congress passed a new law that permanently shields one of America’s proudest achievements from becoming another dysfunctional part of the U.S. government.
