Google, Google, Google
For well over a decade Google has dominated search to the point where most stories in the search sphere were about Google or something on its periphery.
In 2019 Google generated $134.81 billion in ad revenues.
When Verizon bought core Yahoo three years ago the final purchase price was $4.48 billion. That price bought their finance vertical, news vertical, web portal, homepage, email & web search, along with a variety of other services like Tumblr.
Part of what keeps Google so dominant in search is their brand awareness, which is augmented by distribution as the default in Chrome and Android. Then when it comes to buying search distribution from other players like Mozilla Firefox, Opera or Apple’s Safari, they can outbid everyone else because their ad depth lets them monetize tier-2 and emerging markets far better than other search companies can. Even if Bing gave a 100% revshare to Apple they still could not compete with Google in most markets in terms of search monetization.
Apple as a Huge Search Traffic Driver
In 2019 Google paid just under £1.2 billion in default payments for UK search traffic. Most of that went to Apple. Historically when Google broke out their search revenues by region typically the US was around 45% to 46% of search ad revenue & the UK was around 11% to 12%, so it is likely Google is spending north of $10 billion a year to be the default search provider on Apple devices:
Apple submitted that search engines do not pay Apple for the right to be set as the primary default search engine on its devices. However, our assessment is that Google does pay to be the primary default on Apple devices. The agreement between Google and Apple states that Google will be the default web search provider and the same agreement states that Google will pay Apple a specified share of search advertising revenues. We also note that Google does not pay compensation to any partners that set Google Search as a secondary option. This further suggests that Google’s payment to Apple is in return for Apple setting Google as the primary default.
Apple is glad to cash those checks & let Google handle the core algorithmic search function in the web browser, but Apple also auto-completes many searches from within the address bar via various features like website history, top hit, news, Siri suggested website, suggested sites, etc.
A Unique Voice in Search
The nice thing about Apple powering some of those search auto-complete results themselves is that their results are not simply a re-hash of the Google search results. They add a unique voice to the search marketplace: if your site isn’t doing as well in Google, it could still be promoted by Apple based on other factors.
Apple users generally have plenty of disposable personal income and a tendency to dispose of much of it, so if you are an Android user it is probably worth having an Apple device to see what they are recommending for core terms in your client’s markets. If you want to see recommendations for a particular country you may need to have a specialized router targeted to that country or use a web proxy or VPN.
Most users likely conduct full search queries and click through to listings from the Google search result page, but over time the search autocomplete feature that recommends previously viewed websites and other sites likely picks up incremental share of voice.
A friend of mine from the UK runs a local site, and his analytics showed the Apple ecosystem driving nearly two-thirds of his website traffic.
His website is only a couple years old, so it doesn’t get a ton of traffic from other sources yet. As of now his site does not have great Google rankings, but even if it did the boost by the Apple recommendations still provides a tailwind of free distribution and awareness (for however long it lasts).
For topics covered in news or repeat navigational searches Apple likely sends a lot of direct visits via their URL auto-completion features, but they do not extend the feature broadly into the tail of search across other verticals, so only a limited set of searches ultimately benefits from the shortcuts.
Apple Search Ranking Factors
Apple Search may take the following into account when ranking web search results:
- Aggregated user engagement with search results
- Relevancy and matching of search terms to webpage topics and content
- Number and quality of links from other pages on the web
- User location based signals (approximate data)
- Webpage design characteristics
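If those signals were blended into a single score, the simplest mental model is a weighted sum. This is purely an illustrative sketch: the signal names and weights below are invented for the example and are in no way Apple's actual model.

```python
# Toy weighted-sum ranking sketch. Weights are invented and sum to 1.0.
WEIGHTS = {
    "engagement": 0.35,  # aggregated user engagement with results
    "relevance": 0.30,   # match of search terms to page topic & content
    "links": 0.20,       # number & quality of inbound links
    "location": 0.10,    # approximate location-based signals
    "design": 0.05,      # webpage design characteristics
}

def blended_score(signals):
    """signals: dict of signal name -> value normalized to 0..1."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

# A page with strong engagement & relevance but weaker links still scores well.
print(blended_score({"engagement": 0.8, "relevance": 0.9,
                     "links": 0.4, "location": 0.5, "design": 0.7}))
```

Under a model like this, a country-code TLD with clean UX and strong engagement could outrank a bigger brand that wins mostly on links, which fits the behavior described above.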
I have seen some country-code TLDs do well in their local markets in spite of not necessarily being associated with large brands. Sites which do not rank well in Google can still end up in the mix provided the user experience is clean, the site is useful and it is easy for Apple to associate the site with a related keyword.
Panda-like Quality Updates
Markets like news change every day as the news changes, but I think Apple also does some Panda-like updates roughly quarterly where they do a broad refresh of what they recommend generally. As part of those updates sites which were once recommended can end up seeing the recommendation go away (especially if user experience declined since the initial recommendation via an ad heavy layout or similar) while other sites that have good engagement metrics get recommended on related searches.
A friend had a website they sort of forgot that was recommended by Apple. That site saw a big jump on July 9, 2018 then it slid back in early August that year, likely after the testing data showed it wasn’t as good as some other site Apple recommended. They noticed the spike in traffic & improved the site a bit. In early October it was widely recommended once again. That lasted until May of 2019 when it fell off a cliff once more. They had monetized the site with a somewhat spammy ad network & the recommendation mostly went away.
The recommendations happen as the person types, and they may differ between searches where there is a space between keywords and searches where the words are run together. It is also worth noting Apple will typically recommend the www. version of a site over the m. version for sites that offer both, so if you use separate mobile URLs it makes sense to ensure the www version also uses a responsive website design.
Indirect Impact on Google
While the Apple search shortcuts bypass Google search & thus do not create direct user signals to impact Google search, people who own an iPhone then search on a Windows computer at work or a Windows laptop at home might remember the site they liked from their iPhone and search for it once more, giving the site some awareness that could indirectly bleed over into impacting Google’s search rankings.
Apple could also eventually roll out their own fully featured search engine.
Links = Rank
Old Google (pre-Panda) largely boiled down to the following: links = rank.
Once you had enough links to a site you could literally pour content into a site like water and have the domain’s aggregate link authority help anything on that site rank well quickly.
As hyped & important as PageRank was, a diverse range of linking domains and keyword-focused anchor text mattered just as much.
Brand = Rank
After Vince and then Panda, a site’s brand awareness (or, rather, the ranking signals that might best simulate it) was folded into the ability to rank well.
Panda considered factors beyond links & when it first rolled out it would clip anything on a particular domain or subdomain. Some sites like HubPages shifted their content onto per-user subdomains. And some aggressive spammers would rotate their entire site onto different subdomains repeatedly each time a Panda update happened. That allowed those sites to immediately recover from the first couple Panda updates, but eventually Google closed off that loophole.
Any signal which gets relied on eventually gets abused intentionally or unintentionally. And over time it leads to a “sameness” of the result set unless other signals are used:
Google is absolute garbage for searching anything related to a product. If I’m trying to learn something invariably I am required to search another source like Reddit through Google. For example, I became introduced to the concept of weighted blankets and was intrigued. So I Google “why use a weighted blanket” and “weighted blanket benefits”. Just by virtue of the word “weighted blanket” being in the search I got pages and pages of nothing but ads trying to sell them, and zero meaningful discourse on why I would use one
Getting More Granular
Over time as Google got more refined with Panda broad-based sites outside of the news vertical often fell on tough times unless they were dedicated to some specific media format or had a lot of user engagement metrics like a strong social network site. That is a big part of why the New York Times sold About.com for less than they paid for it & after IAC bought it they broke it down into a variety of sites like: Verywell (health), the Spruce (home decor), the Balance (personal finance), Lifewire (technology), Tripsavvy (travel) and ThoughtCo (education & self-improvement).
Penguin further clipped aggressive anchor text built on low quality links. When the Penguin update rolled out Google also rolled out an on-page spam classifier to further obfuscate the update. And the Penguin update was sandwiched by Panda updates on either side, making it hard for people to reverse engineer any signal out of weekly winners and losers lists from services that aggregate massive amounts of keyword rank tracking data.
So much of the link graph has been decimated that Google reversed their stance on nofollow: as of March 1st of this year they treat it as a hint rather than a directive for ranking purposes. Many mainstream media websites were overusing nofollow or not citing sources at all, so this additional layer of obfuscation on Google’s part will allow them to find more signal in that noise.
May 4, 2020 Algo Update
On May 4th Google rolled out another major core update.
Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G — Google SearchLiaison (@searchliaison) May 4, 2020
I saw some sites which had their rankings suppressed for years see a big jump. But many things changed at once.
On some political search queries which were primarily classified as being news related Google is trying to limit political blowback by showing official sites and data scraped from official sites instead of putting news front & center.
“Google’s pretty much made it explicit that they’re not going to propagate news sites when it comes to election related queries and you scroll and you get a giant election widget in your phone and it shows you all the different data on the primary results and then you go down, you find Wikipedia, you find other like historical references, and before you even get to a single news article, it’s pretty crazy how Google’s changed the way that the SERP is intended.”
That change reflects the permanent change to the news media ecosystem brought on by the web.
The Internet commoditized the distribution of facts. The “news” media responded by pivoting wholesale into opinions and entertainment. — Naval (@naval) May 26, 2016
A blog post by Lily Ray from Path Interactive used Sistrix data to show many of the sites which saw high volatility were in the healthcare vertical & other your money, your life (YMYL) categories.
One of the more interesting pieces of feedback on the update was from Rank Ranger, where they looked at particular pages that jumped or fell hard on the update. They noticed sites that put ads or ad-like content front and center may have seen sharp falls on some of those big money pages which were aggressively monetized:
Seeing this all but cements the notion (in my mind at least) that Google did not want content unrelated to the main purpose of the page to appear above the fold to the exclusion of the page’s main content! Now for the second wrinkle in my theory…. A lot of the pages being swapped out for new ones did not use the above-indicated format where a series of “navigation boxes” dominated the page above the fold.
The above shift had a big impact on some sites which are worth serious money. Intuit paid over $7 billion to acquire Credit Karma, but their credit card affiliate pages recently slid hard.
Credit Karma lost 40% traffic from May core update. That’s insane, they do major TV ads and likely pay millions in SEO expenses. Think about that folks. Your site isn’t safe. Google changes what they want radically with every update, while telling us nothing! — SEOwner (@tehseowner) May 14, 2020
The above sort of shift reflects Google getting more granular with their algorithms. Early Panda was all or nothing. Then it started to have different levels of impact throughout different portions of a site.
Brand was sort of a band aid or a rising tide that lifted all (branded) boats. Now we are seeing Google get more granular with their algorithms where a strong brand might not be enough if they view the monetization as being excessive. That same focus on page layout can have a more adverse impact on small niche websites.
One of my old legacy clients had a site which was primarily monetized by the Amazon affiliate program. About a month ago Amazon chopped affiliate commissions in half & then the aggressive ad placement caused search traffic to the site to get chopped in half when rankings slid on this update.
Their site has been trending down over the past couple years, largely due to neglect as it was always a small side project. They improved some of the content about a month or so ago, which led to a bit of a boost, but then this update came. As long as that ad placement doesn’t change, the declines are likely to continue.
They just recently removed that ad unit, but that meant another drop in income, as until there is another big algo update they’re likely to stay at around half search traffic. So now they have a half of a half of a half. Good thing the site did not have any full time employees or they’d be among the millions of newly unemployed. That experience really reflects how websites can be almost like debt-levered companies in terms of going under virtually overnight. Who can have revenue slide around 88% and then increase investment in the property using the remaining 12% while they wait for the site to be rescored for a quarter year or more?
“If you have been negatively impacted by a core update, you (mostly) cannot see recovery from that until another core update. In addition, you will only see recovery if you significantly improve the site over the long-term. If you haven’t done enough to improve the site overall, you might have to wait several updates to see an increase as you keep improving the site. And since core updates are typically separated by 3-4 months, that means you might need to wait a while.”
Almost nobody can afford to do that unless the site is just a side project.
Google could choose to run major updates more frequently, allowing sites to recover more quickly, but they gain economic benefit in defunding SEO investments & adding opportunity cost to aggressive SEO strategies by ensuring ranking declines on major updates last a season or more.
Choosing a Strategy vs Letting Things Come at You
They probably should have lowered their ad density when they did those other upgrades. If they had they likely would have seen rankings at worst flat or likely up as some other competing sites fell. Instead they are rolling with a half of a half of a half on the revenue front. Glenn Gabe preaches the importance of fixing all the problems you can find rather than just fixing one or two things and hoping it is enough. If you have a site which is on the edge you sort of have to consider the trade offs between various approaches to monetization.
- monetize it lightly and hope the site does well for many years
- monetize it slightly aggressively while using the extra income to further improve the site elsewhere and ensure you have enough to get by any lean months
- aggressively monetize the site shortly after a major ranking update if it was previously lightly monetized & then hope to sell it off a month or two later before the next major algorithm update clips it again
Outcomes will depend partly on timing and luck, but consciously choosing a strategy is likely to yield better returns than doing a bit of mix-n-match while having your head buried in the sand.
Reading the Algo Updates
You can spend 50 or 100 hours reading blog posts about the update and learn precisely nothing in the process if you do not know which authors are bullshitting and which authors are writing about the correct signals.
But how do you know who knows what they are talking about?
It is more than a bit tricky as the people who know the most often do not have any economic advantage in writing specifics about the update. If you primarily monetize your own websites, then the ignorance of the broader market is a big part of your competitive advantage.
Making things even trickier, the less you know, the more likely Google is to trust you to relay their official messaging. If you syndicate their messaging without questioning it, you get a treat – more exclusives. If you question their messaging in a way that undermines their goals, you quickly become persona non grata – something CNET learned many years ago when they published Eric Schmidt’s address.
It would be unlikely you’d see the following sort of Tweet from say Blue Hat SEO or Fantomaster or such.
I asked Gary about E-A-T. He said it’s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that’s good.
To be able to read the algorithms well you have to have some market sectors and keyword groups you know well. Passively collecting an archive of historical data makes the big changes stand out quickly.
Everyone who depends on SEO to make a living should subscribe to an online rank tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally it makes sense to use a set of web proxies and run the queries slowly through each so you don’t get blocked.
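One hedged sketch of how those slow, proxy-rotated checks might be scheduled: assign each keyword a proxy round-robin plus a randomized delay so no single proxy sends queries fast enough to get blocked. The proxy addresses below are placeholders, and the actual fetching (via Serposcope or your own scraper) is deliberately omitted.

```python
import itertools
import random

def build_check_plan(keywords, proxies, min_gap=30.0, max_gap=90.0, seed=None):
    """Return a list of {keyword, proxy, delay} steps for rank checks.

    Each keyword is paired with a proxy round-robin and a random delay
    (in seconds) to wait before issuing that query.
    """
    rng = random.Random(seed)
    proxy_cycle = itertools.cycle(proxies)
    return [
        {"keyword": kw, "proxy": next(proxy_cycle),
         "delay": rng.uniform(min_gap, max_gap)}
        for kw in keywords
    ]

plan = build_check_plan(
    ["head term", "midtail term one", "longtail term example"],
    ["10.0.0.1:8080", "10.0.0.2:8080"],  # placeholder proxy addresses
    seed=1,
)
for step in plan:
    print(f'{step["keyword"]} via {step["proxy"]} after {step["delay"]:.0f}s')
```

With a dozen or two keywords and a handful of proxies, each proxy ends up issuing a query only every few minutes, which is the slow pace the paragraph above recommends.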
You should track a diverse range of terms to get a true sense of the algorithmic changes:
- a couple different industries
- a couple different geographic markets (or at least some local-intent vs national-intent terms within a country)
- some head, midtail and longtail keywords
- sites of different size, age & brand awareness within a particular market
Some tools make it easy to quickly add or remove graphing of anything which moved big and is in the top 50 or 100 results, which can help you quickly find outliers. And some tools also make it easy to compare their rankings over time. As updates develop you’ll often see multiple sites making big moves at the same time & if you know a lot about the keyword, the market & the sites you can get a good idea of what might have been likely to change to cause those shifts.
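The "big mover" spotting those tools do can be sketched in a few lines: compare two top-50 snapshots for a keyword and flag URLs whose position moved by more than some threshold. The data and domain names below are invented; real input would come from your rank tracker.

```python
def big_movers(before, after, threshold=10, top_n=50):
    """before/after: dict of url -> position (1 = top).

    Returns (url, old_pos, new_pos, delta) tuples, biggest absolute
    moves first. A positive delta means the URL improved. URLs missing
    from a snapshot are treated as sitting just beyond the top N.
    """
    movers = []
    for url, old_pos in before.items():
        new_pos = after.get(url, top_n + 1)  # dropped out => beyond top N
        delta = old_pos - new_pos
        if abs(delta) >= threshold:
            movers.append((url, old_pos, new_pos, delta))
    for url, new_pos in after.items():       # brand-new entries to the top N
        if url not in before and (top_n + 1) - new_pos >= threshold:
            movers.append((url, top_n + 1, new_pos, (top_n + 1) - new_pos))
    return sorted(movers, key=lambda m: -abs(m[3]))

before = {"site-a.example": 3, "site-b.example": 22, "site-c.example": 48}
after = {"site-a.example": 2, "site-b.example": 5, "site-d.example": 9}
for url, old, new, delta in big_movers(before, after):
    print(url, old, "->", new, f"({delta:+d})")
```

Running that across many keywords during an update window surfaces exactly the kind of simultaneous multi-site moves the next paragraph describes.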
Once you see someone mention outliers that most people missed which align with what you see in your own data set, your level of confidence increases and you can spend more time trying to unravel which signals changed.
I’ve read influential industry writers mention that links were heavily discounted on this update. I have also read Tweets which could potentially indicate the opposite.
If I had little to no data, I wouldn’t be able to get any signal out of that range of opinions. I’d sort of be stuck at “who knows.”
By having my own data I can quickly figure out which message is more in line with what I saw in my subset of data & form a more solid hypothesis.
No Single Smoking Gun
As Glenn Gabe is fond of saying, sites that tank usually have multiple major issues.
Google rolls out major updates infrequently enough that they can sandwich a couple different aspects into major updates at the same time in order to make it harder to reverse engineer updates. So it does help to read widely with an open mind and imagine what signal shifts could cause the sorts of ranking shifts you are seeing.
Sometimes site level data is more than enough to figure out what changed, but as the above Credit Karma example showed sometimes you need to get far more granular and look at page-level data to form a solid hypothesis.
As the World Changes, the Web Also Changes
About 15 years ago online dating was seen as a weird niche for recluses who perhaps typically repulsed real people in person. Now there are all sorts of niche specialty dating sites, including a variety of DTF-type apps. What was once weird & absurd has over time become normal.
The COVID-19 scare is going to cause lasting shifts in consumer behavior that accelerate the movement of commerce online. A decade of change will happen in a year or two across many markets.
Telemedicine will grow quickly. Facebook is adding commerce features directly onto their platform through partnering with Shopify. Spotify is spending big money to buy exclusive rights to distribute widely followed podcasters like Joe Rogan. Uber recently offered to acquire GrubHub. Google and Apple will continue adding financing features to their mobile devices. Movie theaters have lost much of their appeal.
Tons of offline “value” businesses ended up having no value after months of revenue disappearing while large outstanding debts accumulated interest. There is a belief that some of those brands will have strong latent brand value that carries over online, but if they were weak even when the offline stores acting like interactive billboards subsidized consumer awareness of their brands then as those stores close the consumer awareness & loyalty from in-person interactions will also dry up. A shell of a company rebuilt around the Toys R’ Us brand is unlikely to beat out Amazon’s parallel offering or a company which still runs stores offline.
Big box retailers like Target & Walmart are growing their online sales at hundreds of percent year over year.
There will be waves of bankruptcies, dramatic shifts in commercial real estate prices (already reflected in plunging REIT prices), and more people working remotely (shifting residential real estate demand from the urban core back out into suburbs).
People who work remote are easier to hire and easier to fire. Those who keep leveling up their skills will eventually get rewarded while those who don’t will rotate jobs every year or two. The lack of stability will increase demand for education, though much of that incremental demand will be around new technologies and specific sectors – certificates or informal training programs instead of degrees.
More and more activities will become normal online activities.
The University of California has about a half-million students & in the fall semester they are going to try to have most of those classes happen online. How much usage data does Google gain as thousands of institutions put more and more of their infrastructure and services online?
Colleges have to convince students for the next year that a remote education is worth every bit as much as an in-person one, and then pivot back before students actually start believing it.
It’s like only being able to sell your competitor’s product for a year. — Naval (@naval) May 6, 2020
A lot of B & C level schools are going to go under as the like-vs-like comparison gets easier. Back when I ran a membership site here a college paid us to have students gain access to our membership area of the site. As online education gets normalized many unofficial trade-related sites will look more economically attractive on a relative basis.
If core institutions of the state deliver most of their services online, then other companies can be expected to follow. When big cities publish lists of crimes they will not respond to during economic downturns they are effectively subsidizing more crime. That in turn makes moving to somewhere a bit more rural & cheaper make sense, particularly when you no longer need to live near your employer.
The most important implication of this permanent WFH movement are state income taxes.
The warm, sunny states with affordable housing and zero taxes will see an influx of educated, rich workers. States will need to cut taxes to keep up.
The biggest loser in this is CA. — Chamath Palihapitiya (@chamath) May 21, 2020
To Teach, One Must Learn
One of the benefits of writing is it forces you to structure your thoughts.
If you are doing something to pass a test rote memorization can work, but if you are trying to teach someone else and care it forces you to know with certainty what you are teaching.
When I was in nuclear power school one guy was about to flunk out and I did not want to let him, so I taught him stuff for days. He passed that test, and as a side effect I got my highest score ever on one of those tests. He eventually did flunk out, but he knew other people were rooting for him and trying to help him.
Market Your Work or Become Redundant
Going forward as more work becomes remote it is going to be easier to hire and fire people. The people who are great at sharing their work and leaving a public record of it will likely be swimming in great opportunities, whereas some equally talented people who haven’t built up a bit of personal brand equity will repeatedly get fired in spite of being amazingly talented, simply because there was a turn in the economy and management is far removed from the talent. As bad as petty office politics can be, it will likely become more arbitrary when everyone is taking credit for the work of others & people are not sitting side by side to see who actually did the work.
Uber recently announced they were laying off thousands of employees while looking to move a lot of their core infrastructure work overseas where labor is cheaper. Lots of people will be made redundant as unicorn workers in a recession suddenly enjoy the job stability and all the perks of the gig working economy.
We have a great graphic designer who is deeply passionate about his work. He can hand draw amazing art or comics and is also great at understanding illustration software, web design, web usability, etc. I have no idea why he was fired from his prior employer but am thankful he was as he has been a joy to work with.
Before COVID-19 killed office work I sat right next to our lead graphic designer, and when I would watch him use Adobe Illustrator I was both in awe of him and annoyed at how easy he made things look. He is so good at it that an endless array of features are second nature to him. When I would ask him how to do something I had just seen him do, it was often harder for him to explain how he did it than to simply do it.
Our graphics designer is also a quite solid HTML designer, though strictly front end design. One day when I took an early lunch with my wife I asked him to create a Wordpress theme off his HTML design and when I got back he was like … ummm. 🙂
We are all wizards at some things and horrible at others. When I use Adobe Illustrator for even the most basic tasks I feel like a guy going to a breakdancing party with no cardboard and 2 left shoes.
There are a number of things that are great about programming:
- it is largely logic-based
- people drawn toward it tend to be smart
- people who can organize code also tend to use language directly (making finding solutions via search rather easy)
Though over time programming languages change features & some changes are not backward compatible. And as some free & open source projects accumulate dependencies they end up requiring package managers. Some of these may not be easy to install & configure on a remote shared server (with user permission issues) from a Windows computer. So then you install another package on your local computer and then have to research how it came with a deprecated PHP track_errors setting. And on and on.
One software program I installed on about a half-dozen sites many moons ago launched a new version recently & the typical quick 5 minute install turned into a half day of nothing. The experience felt a bit like a “choose your own adventure” book, where almost every choice you make leads to: start again at the beginning.
At that point a lot of the advice one keeps running into sort of presumes one has the exact same computer set up they do, so search again, solve that problem, turn on error messaging, and find the next problem to … once again start at the beginning.
That sort of experience is more than a bit humbling & very easy to run into when one goes outside their own sphere of expertise.
Losing the Beginner’s Mindset
If you do anything for an extended period of time it is easy to take many things for granted as you lose the beginner’s mindset.
One of the reasons it is important to go outside your field of expertise is to remind yourself of what that experience feels like.
Anyone who has been in SEO for a decade likely does the same thing when communicating about search, presuming the same level of domain expertise and talking past people. Some aspects of programming are hard because they are complex. But when you are doing simple and small jobs, if things absolutely do not work you often get the answer right away. Whereas with SEO you can be unsure of the results of a large capital and labor investment until the next core algorithm update happens a quarter year from now. That uncertainty acts as a barrier to entry & a blocker of institutional investments, which allows for sustained above-average profit margins for those who make the cut, but it also means a long lag time and a high level of certainty required before making a big investment.
The hard part about losing the beginner’s mindset with SEO is that sometimes the algorithms change dramatically and you have to absolutely reinvent yourself, throwing out what you know (use keyword-rich anchor text aggressively, build tons of links, exact match domains beat out brands, repeat keyword in bold on page, etc.) and starting afresh as the algorithms reshuffle the playing field.
The Web Keeps Changing
While the core algorithms are shifting, so too is how people use the web. User behaviors are shifting as search results add more features and people search on mobile devices or using their voice. Now that user engagement is a big part of ranking, anything which impacts brand perception or user experience also impacts SEO. Social distancing will have major impacts on how people engage with search. We have already seen a rapid rise of e-commerce at the expense of offline sales & some colleges are planning on holding next year entirely online. The University of California will have roughly a half-million students attending school online next year unless students opt for something cheaper.
I am horrible with Adobe Illustrator. But one thing I have learned with it and Photoshop is that if you edit at a rather high resolution, many of your errors disappear to the naked eye when the work is viewed at a normal resolution. The same analogy holds true for web design, but in the opposite direction: if your usability is solid on a mobile device and the design looks good there, it will probably be decent on desktop as well.
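The dilution effect behind that analogy can be sketched numerically. The toy Python below (a hypothetical sketch, not tied to any actual image editor) average-pools a small image and shows how a glaring single-pixel error nearly vanishes once the resolution drops:

```python
# Toy illustration: downscaling by average pooling dilutes a single-pixel error.
# Pure-Python sketch; the function name and image values are made up for the example.

def downscale(img, k):
    """Average-pool a 2D list of pixel values by factor k (assumes k divides both dims)."""
    h, w = len(img), len(img[0])
    out = []
    for i in range(0, h, k):
        row = []
        for j in range(0, w, k):
            block = [img[i + di][j + dj] for di in range(k) for dj in range(k)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# An 8x8 mid-gray image with one badly wrong pixel.
img = [[128.0] * 8 for _ in range(8)]
img[3][3] = 255.0  # the editing "error"

small = downscale(img, 4)
# The error's 4x4 block averages to 128 + (255 - 128) / 16 = 135.9375:
# a 127-point mistake shrinks to under 8 points, invisible to the naked eye.
```

Each halving of resolution spreads any local mistake over more source pixels, which is why errors made at high resolution fade when viewed small.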
Some people also make a resolution mistake with SEO.
- If nobody knows about a site, brand, or company, then having perfectly valid HTML, supporting progressive web apps, supporting AMP, using microformats, etc. does not matter.
- On the flip side, a well-known site can get away with doing many things sub-optimally, and it can perhaps improve a lot by emulating sites which are growing over time in spite of weaker brand strength.
Free, so Good Enough?
Many open source software projects do not do usability testing or track how an average or new user fares at downloading and installing the software, figuring that since it is free, people should figure it out. That thinking is a mistake: each successive increase in the barrier to entry limits your potential market size, and eventually some old users leave for one reason or another.
Any free software project which accumulates attention and influence can be monetized in other ways (consulting, parallel SaaS offerings, affiliate ad integration, partnering with Hot Nacho to feature some great content in a hidden div using poetic code, etc.). But projects which lack reach, see slowing growth, and then increase the barrier to entry are likely to die.
When you ask someone to pay for something you’ll know if they like it and where they think it can be improved. Relying on the free price point hides many problems and allows them to accumulate.
The ability to make things easy for absolute beginners is a big part of why WordPress is worth many multiples of what Acquia sold for. WordPress also has its VIP hosting service, Akismet, and a bunch of other revenue streams, while Acquia is now owned by a private equity company.
Being even 0.0000001% as successful as WordPress has been, without losing the beginner's mindset, is hard.
Remedy has offered new details about the upcoming AWE expansion in its latest livestream, but one of the coolest announcements to emerge from that video is what is coming in the free August update. Remedy revealed that it is responding to complaints about sparse checkpoints and overall difficulty with a number of new options.
The game is adding new control points as well as soft checkpoints in certain areas.
In addition, a new “Assist Mode” provides tremendous flexibility for players to tailor the game to their preferences, adjusting incoming and outgoing damage, reload and recharge speeds, and even turning off character death entirely.
Many players at launch found Control’s difficulty to be uneven, so this new addition from Remedy should come as welcome news.
The free update is due out this month.
Far-UVC light is a type of ultraviolet light that kills microbes and viruses and, crucially, seems to be safe to use around humans. Radiation scientist David Brenner describes how we could use this light to stop the spread of SARS-CoV-2, the virus responsible for COVID-19, in hospitals, nursing homes, trains and other public indoor spaces — paving the way for a potentially game-changing tool in the fight against the coronavirus pandemic. (This virtual conversation, hosted by TED science curator David Biello, was recorded July 7, 2020.)