Tagged: content moderation

  • Geebo 9:00 am on March 9, 2021
    Tags: content moderation

    Why is Craigslist failing?

    By Greg Collier

    Since Craigslist is a privately owned company, they do not have to disclose their finances. However, the AIM Group, a watchdog organization that keeps tabs on the online marketplace space, recently asserted that Craigslist has lost close to 50% of their revenue in just two years. The AIM Group once famously referred to Craigslist as a cesspool of crime.

    According to what the AIM Group calls their proprietary methodology, Craigslist’s revenue dropped from $1 billion in 2018 to $565 million in 2020, a decline of roughly 45% in just two years. Part of the drop can absolutely be attributed to the COVID-19 pandemic, but the decline started long before lockdown. Part of it can also be attributed to the number of competitors that have recently entered the marketplace space, and niche sites like Airbnb have taken a chunk out of Craigslist’s userbase as well. However, we think the biggest factor is the poor business decisions Craigslist has been making for the past 20 years.

    For much of Craigslist’s history, it was rumored that the majority of their traffic came from their erotic services section. Due to mounting legal pressure over human trafficking concerns, that section was shuttered in 2010. Craigslist’s revenues took a slight dip in 2011 but continued to climb until 2018. So, what happened in 2018 to cause such a downward spiral? That’s when Craigslist shuttered their personals section over fears of the anti-sex-trafficking laws FOSTA/SESTA. After the erotic services section closed, traffickers had simply moved their ads selling women and girls to the personals section. By closing the personals after FOSTA/SESTA was signed into law, Craigslist virtually admitted that their platform had a sex trafficking problem.

    Craigslist’s problem is that in their 26 years, they’ve refused to moderate any section of their site to keep out criminals and scammers. Craigslist only seems to moderate content when threatened with legal action. But moderation costs money, and Craigslist has a reputation for maximizing profits above all else, even at the expense of their userbase’s safety. Geebo.com and several other platforms moderate their content and still manage to be profitable. The only security measure Craigslist offers is a list of safety tips, and unless they change their tune, they will continue to decline.

     
  • Geebo 11:03 am on February 12, 2019
    Tags: content moderation

    Geebo introduces new feature to better protect consumers

    Since the beginning, Geebo has always had user safety in mind, not just personal safety but financial and emotional safety as well. Many of the safety choices made by Geebo have gone against what many would consider industry standards. However, we’ve always stood by those choices and have challenged other online marketplaces to do the same. For example, other classifieds sites rely on users to flag potentially fraudulent ads, which has led to abuses of those flagging systems. Instead, Geebo employs a trained staff to moderate each ad for potentially fraudulent or illegal activity, and the innovation doesn’t stop there.

    In 2010, Geebo took a stand against sites like Craigslist and Backpage by launching an anti-human-trafficking campaign designed to bring awareness to the plight of victims trafficked through Geebo’s competitors. That same year, Geebo closed its personal ads section due to the amount of trafficking that took place in the personals sections of other sites. Even though there were no reported incidents involving Geebo’s personals, Geebo felt removing the section was necessary to further ensure user safety.

    In 2011, Geebo CEO Greg Collier wrote an open letter to other online classifieds asking them to take user safety more seriously by implementing measures such as moderating ads and removing adult-oriented ads. Those challenges went largely unheeded; other classifieds sites did not remove their adult sections until media and government pressure forced them to, and their remaining ads are still largely unmoderated.

    In January 2013, Geebo made the decision to stop accepting ads for pets. In a company blog post, CEO Greg Collier noted that puppy mills selling abused or sick animals commonly use online classifieds.

    In May 2015, Geebo partnered with the AIM Group’s SafeTrade Station initiative in order to provide a list of safe trading spots at police stations across the country. Each Geebo ad contains a link to the SafeTrade Stations website so users can find a safe location to make their transactions.

    In 2016, in response to the Orlando nightclub shooting and other mass shootings, Geebo stopped accepting ads for firearms even though no firearms-related crimes were ever linked to Geebo.

    That brings us to Geebo’s latest innovation for user safety. Since late 2018, Geebo staff have been monitoring the responses to ads made through the Geebo platform. By doing this, we can determine whether a response is coming from overseas, which is a strong indication that the respondent is a scammer. It also allows us to detect potential fraud from inside the country, since our staff is trained to recognize the most commonly used scams. We believe this proactive stance against scammers will go a long way toward protecting the safety of our customers. While other sites and apps in our industry may see this as going above and beyond the call of duty, we believe it’s the most logical and necessary step in consumer protection.
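
    As a rough illustration only, and not a description of Geebo’s actual tooling, the kind of check described above could be sketched out like the snippet below. The country code, reply text, and phrase list are all assumptions made for the example; in practice they would come from whatever geolocation lookup and intake process a marketplace already runs.

```python
# Illustrative sketch only -- not Geebo's actual implementation.
# Assumes the marketplace already resolves each responder's IP address to a
# two-letter country code (for example via a GeoIP database) and passes the
# reply text in for a simple keyword screen.

COMMON_SCAM_PHRASES = (
    "western union",
    "cashier's check",
    "shipping agent",
    "overpayment",
    "gift card",
)

def flag_response(country_code: str, message: str, home_country: str = "US") -> list[str]:
    """Return the reasons, if any, a reply should be routed to a human moderator."""
    reasons = []
    # Replies that originate outside the seller's country are treated as higher risk.
    if country_code.upper() != home_country.upper():
        reasons.append(f"response originated outside {home_country} ({country_code})")
    # A simple phrase screen catches wording used in the most common scams.
    lowered = message.lower()
    for phrase in COMMON_SCAM_PHRASES:
        if phrase in lowered:
            reasons.append(f"contains common scam phrase: '{phrase}'")
    return reasons

if __name__ == "__main__":
    example = "I will mail a cashier's check for more than the asking price."
    print(flag_response("NG", example))
```

    A sketch like this only surfaces suspicious replies for review; as described above, trained staff still make the final call on whether a response is actually fraudulent.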

     
  • Geebo 9:03 am on July 17, 2018
    Tags: Channel 4, content moderation, Inside Facebook: Secrets of the Social Network

    Documentary: Facebook needs controversy to survive

    Have you ever seen something posted on Facebook that was so offensive you actually took the time to complain to Facebook? I did once. I saw a post accusing a man of a horrible crime even though there was no tangible evidence to support the claim. That was three years ago, and this particular post has since been shared millions of times as if it were fact. For all I know, this man’s life could have been ruined by false accusations. The response I got from Facebook, on multiple occasions, was that the post wasn’t deleted because it did not violate their nebulous ‘community standards’. Now, a soon-to-be-released documentary claims this is par for the course when it comes to Facebook moderation.

    British TV broadcaster Channel 4 had a journalist go undercover at a firm contracted to moderate Facebook content. The resulting documentary, entitled “Inside Facebook: Secrets of the Social Network,” claims that Facebook allows controversial content like this to proliferate on its network because it keeps people engrossed in Facebook’s walled garden. This, in turn, is said to increase Facebook’s advertising revenue. That makes it sound a lot like Facebook is profiting from the suffering of others, since, according to Business Insider, much of the controversial material that isn’t deleted consists of instances of child abuse.

    In a world where discourse is becoming increasingly toxic, Facebook appears to be throwing gasoline on the fire while making money selling pitchforks and torches. Facebook denies these claims, but the evidence seems to indicate the contrary. However, as usual, the problem could be solved if we all did one thing: start using social media more responsibly and stop sharing every little thing that causes us the slightest bit of outrage. It’s time to start viewing social media with a more discerning eye.

     
  • Geebo 9:07 am on October 9, 2017
    Tags: content moderation

    Facebook to manually review ads, so why don’t others?

    Facebook has come under fire recently for allegedly accepting money for ads from a Russian entity known as the Internet Research Agency. These ads ran for two years and were intended to fuel the fires of the rampant political discord already troubling our country. Some of the ads could even be viewed as racist or anti-Semitic. After turning over records of these ads to Congress, Facebook announced it would be hiring 1,000 people to manually review certain ads targeted toward religious, ethnic, and social groups.

    However, this blog post is ultimately not about Facebook, but about another website that touts itself as being socially responsible. We’re of course referring to craigslist. From its iconic purple peace sign logo to the numerous charitable foundations craigslist founder Craig Newmark has donated to, craigslist appears on the surface to be a socially conscious entity, yet they still do nothing to protect their own users.

    Craigslist ads remain largely unmoderated, which has led to a vast number of scams and violent crimes. Their rants & raves section is filled with all sorts of vitriol and hate, from blatant racism to calls for violence. Their casual encounters section is often the playground of child predators looking for their next victim. Yet craigslist does not hire any moderators, refusing to expand beyond their reported two dozen or so employees.

    While craigslist may not be as lucrative as Facebook, I think they could probably scrounge up enough money to hire a team of moderators. They just choose not to.

     
  • Geebo 9:01 am on July 5, 2017
    Tags: content moderation, Slate

    Craigslist has nothing to teach Facebook

    Noted news and opinion website Slate recently published an article entitled “What Facebook Can Learn From Craigslist.” One could assume from the headline that Slate means craigslist can teach Facebook something about Facebook Marketplace, but that’s not the point Slate is trying to make. Instead, Slate makes the questionable claim that craigslist has ‘conquered’ its own content moderation, which leads to the question: what moderation?

    Granted, Facebook has had its own controversies lately, with Facebook Live being used to broadcast a number of crimes and suicides, and with the ever-growing problem of hate speech. However, craigslist should not be held up as a shining example of how content should be moderated. In researching this post, it took me under a minute to find something racist posted in craigslist’s forum section, and that’s not even taking into account the news stories that go out almost daily containing the words ‘beware’ and ‘craigslist’.

    Let’s not forget the 115 victims who have allegedly been killed during craigslist transactions.

    If anything, craigslist could learn from Facebook. While craigslist only has around 40 employees, Facebook has hired contracted content moderators to at least try to curb some of the material that goes against its terms of service. Craigslist wouldn’t even remove their adult ads section until well after CNN’s Amber Lyon famously confronted craigslist founder Craig Newmark about the human trafficking that took place on the site.

    The only thing that craigslist can teach is how not to do things.

     