Improve SEO By Removing Your Duplicate Content

Duplicate content is like a virus. When a virus enters your system, it begins to replicate itself until it is ready to be released and cause all kinds of nasty havoc within your body. On the web, a little duplicate content isn’t a huge problem, but the more it replicates itself, the bigger the problem you’re going to have. Too much duplicate content and your website will come down with some serious health issues.

I’m going to break this into three parts. In this post, I’ll discuss the problems caused by duplicate content. In Part II, I’ll address the causes of duplicate content, and in Part III, I’ll discuss some solutions for eliminating it.

Duplicate Content Causes Problems. Duh!

Google and other search engines like to tell us that they have the duplicate content issue all figured out. And, in the cases where they don’t, they provide a couple of band-aid solutions for you to use (we’ll get to these later). While there may be no such thing as a “duplicate content penalty”, there are certainly filters in place in the search engine algorithms that devalue content that is considered duplicate, and make your site as a whole less valuable in the eyes of the search engines.

If you trust the search engines to handle your site properly, and don’t mind having important pages filtered out of the search results, then go ahead and move on to another story… you’ve got nothing to worry about.

Too many pages to index

Theoretically, there is no limit to the number of pages on your site that the search engines can add to their index. In practice, though, if they find too much “junk”, they’ll stop spidering pages and move on to the next site. They may come back and keep grabbing content they missed, but likely at a much slower pace than they otherwise would.

Duplicate content, in practice, creates “junk” pages. These pages may have some value on their own, but compared to the one, two, or dozen other pages on your site or across the web that carry the same content, there is nothing unique for the search engines to care about. It’s up to the engines to decide which pages are unnecessary and which is the original source, or most valuable page, to include in the search results.

The rest is just clutter that the search engines would rather not have.

Slows search engine spidering

With so many duplicate pages to sort through, the search engines tire easily. Instead of indexing hundreds of pages of unique content, they are left sifting through thousands of pages of some original content and a whole lot of duplicate crap. Yeah, you’d tire too!

Once the engines get a whiff that a site is overrun with dupes, the spidering process will often be reduced to a slow crawl. Why rush? There are plenty of original sites out there they can be gathering information on. Maybe they’ll find a good nugget or two on your site, but it can wait, as long as they are finding gold mines elsewhere.

Splits valuable link juice

When more than one page (URL) on your site carries the same content as another, the question becomes which page gets the links. In practice, whichever URL the visitor lands on and bookmarks, or passes on via social media, is the page that gets the link value. But each visitor may land on a different URL with that same content.

If 10 people visit your site, 5 land on and choose to link to one URL, while the other 5 land on and choose to link to the other (both being the same content), instead of having one page that has 10 great links, you have 2 pages each with half the linking value. Now imagine you have 5 duplicate pages and the same scenario happens. Instead of 10 links going to a single page, you may end up with 2 links going to each of the 5 duplicate versions.

So, for each duplicate page on your site, you are cutting the link value that any one of the pages could achieve. When it comes to rankings, this matters. In our second scenario, all it takes, essentially, is a similarly optimized page with 3 links to outrank your page with only 2. Not really fair, because the same content really has 10 links, but it’s your own damn fault for splitting up your link juice like that.

Inaccessible pages

We talked above about how duplicate content slows spidering, leaving some content out of the search engine’s index. Leaving duplicate content aside for a moment, let’s consider the page URLs themselves. We’ve all seen those URLs that are so long and complicated that you couldn’t type one out if it was dictated to you. While not all of these URLs are problematic, some of them certainly can be. Not to mention URLs that are simply undecipherable as unique pages.

We’ll talk more about these URLs in part 3, but for now, let’s just consider what it means when a URL cannot be spidered by the search engines. Well, simply put, if the search engines can’t spider it, then it won’t get indexed. The browser may pull open a page the visitors can see, but the search engines get nothin’. And when you multiply that nothin’ the search engines get with the nothin’ they’ll show in the results (don’t forget to carry the nothin’), you get a whole lot of nothin’ going on.

Pages that are inaccessible to the search engines can’t act as landing pages in the search results. That’s fine if it’s a useless page, but not if it’s something of value that you want to be driving traffic to.

There are a lot of problems caused by duplicate content and bad URL development. These problems may be minor or cataclysmic, depending on the site. Either way, small problem or large, it’s probably a good idea to figure out the cause of your duplicate content problems so you can begin to implement solutions that will pave the way for better search engine rankings.

.htaccess Tricks To Speed Up Your Website

Is your unoptimized website bleeding money due to a slow average page load time? This guide will show you how to optimize your .htaccess (Apache) file to implement speed improvements.

Just a one-second delay in page response can result in a 7% reduction in conversions, and 40% of users abandon a shopping cart that takes more than 3 seconds to load, according to KissMetrics. With more users making purchase decisions on mobile devices each year, page load optimization won’t patiently wait on your back burner any longer without affecting your bottom line.

Fortunately, there are several effective tactics to speed up your website without even touching your main website code. Even if you have little experience with .htaccess or server modification, this detailed guide will give you the tools and knowledge to take a bite out of your page load.

What is .htaccess?

Glad you asked. An .htaccess file allows you to modify Apache web server configuration settings without modifying the main configuration file – in other words, you can customize the way the server behaves while keeping the core settings intact, much like using a child theme in WordPress. Most webhosts allow clients to use .htaccess files, but if you’re not sure, check with your host.

How Do You Use .htaccess?
Simply open any text editor and create a new file called .htaccess. Could it really be that simple? Well, yes and no. Most likely, your computer will perceive the .htaccess file as an operating-system file and hide it from view. To see the file, you’ll need to follow a guide like this one from SitePoint to show hidden files. Once you have that taken care of, you’re ready to move on!

Some considerations before you start:
When editing .htaccess files, minor mistakes in syntax can break your site. Therefore, it’s always a good idea to back up any existing .htaccess files (if applicable) before you begin editing. If necessary, you can comment out an existing line by using the # symbol at the beginning.
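
For example, to temporarily disable a directive without deleting it (using the Options directive from later in this guide purely as an illustration), you would simply prefix it with #:

#The following directive is commented out and will be ignored by Apache:
#Options -Indexes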

Some of the common ways an .htaccess file can get broken:

  • Bad syntax – in other words, improperly formatting the code.
  • If you make .htaccess edits through cPanel, they can conflict with changes you made by hand.

With the proper precautions and a reliable source to copy and paste code from, there’s no reason not to take advantage of .htaccess to improve your site.

7 Tricks for Improving Site Speed with .htaccess

On to the good part: how can we harness the power of .htaccess to improve page load time? Try one (or all!) of these 7 tried-and-true customizations:

Turn on content caching
Google recommends caching all static (permanent) resources – including Javascript, CSS, media files, images, and more. Caching saves these resources to the user’s local memory so files don’t need to be downloaded for repeat visits. This modification alone can significantly reduce page load time – not to mention bandwidth usage.

While some servers will cache a few static resources by default, it’s best to explicitly tell your server to cache all of them. More importantly, the default expiration period for cached entities is one hour, while Google recommends a minimum of one month, and even up to one year (but no more than that).

To ensure the server is caching all static resources and for the maximum time recommended by Google, we’ll be using mod_expires. Open .htaccess and paste the following inside:


# Set up caching on static resources for 1 year based on Google recommendations
<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav|js|css|gif|jpg|jpeg|png|swf)$">
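# "A29030400" means access time plus 29,030,400 seconds (about 48 weeks)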
ExpiresDefault A29030400
</FilesMatch>
</IfModule>

You can check whether it’s working by viewing the response headers for a file on your server. In Chrome, open the developer tools and go to the Network tab. Reload your page and click on a CSS file. Make sure you see the Expires date in the response headers.
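
If you prefer the command line, requesting the same file with curl -I (assuming curl is installed on your machine) will print the response headers, including the Expires date, without opening a browser.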

Compress output with gzip
This .htaccess modification compresses resources as they’re being downloaded to the user’s browser, thereby reducing page load time. By default, it won’t compress anything below 500 bytes, which is a good thing, because compressing files below that size can ironically increase load time.

To utilize this mod, copy and paste the code below into your .htaccess file:


# Enable gzip compression
<ifModule mod_gzip.c>
 mod_gzip_on Yes
 mod_gzip_dechunk Yes
 mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
 mod_gzip_item_include handler ^cgi-script$
 mod_gzip_item_include mime ^text/.*
 mod_gzip_item_include mime ^application/x-javascript.*
 mod_gzip_item_exclude mime ^image/.*
 mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
</ifModule>
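
Note that mod_gzip is an older module; many modern Apache servers ship with mod_deflate instead. If that’s what your host provides, a roughly equivalent snippet (a minimal sketch, assuming mod_deflate is enabled) would be:

# Enable compression via mod_deflate (alternative to mod_gzip)
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript application/javascript application/x-javascript
</IfModule>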

Turn off directory indexing
This mod kills two birds with one stone, improving both speed and privacy.

By default, any visitor can actually look inside any directory that doesn’t have an index file (index.html, index.php, etc.) in it. That means configuration files and other sensitive data could potentially be up for grabs to malicious users.

Unless you want to add a blank index.html file to every folder on your website (and trust future developers to do the same), take the easy road and modify your .htaccess file instead. In the process, you’ll be saving a bit of server resources – especially if you have very large directories. To turn off directory indexing, open your .htaccess file and add:


#Disable Directory Indexes
Options -Indexes

Prevent hotlinking
Have you ever seen another website display an image by linking directly to the image file on someone else’s server? That’s called hotlinking, and it eats up bandwidth on the host’s server. Thankfully, it’s possible to prevent other domains from hotlinking to your website. To ensure nobody is using your precious bandwidth, add this script to your .htaccess file:


#Prevent Hot Linking
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain\.com [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]

Force files to download instead of open in browser
If your site serves a lot of media files, speed is a top priority. If it’s practical for your users to download files to their hard drives once rather than stream repeatedly from your website, this mod will save your bandwidth:


#Force certain types of files to download instead of load in browser
#Only include filetypes that you want to download automatically
AddType application/octet-stream .csv
AddType application/octet-stream .xls
AddType application/octet-stream .doc
AddType application/octet-stream .avi
AddType application/octet-stream .mpg
AddType application/octet-stream .mov
AddType application/octet-stream .pdf
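
Alternatively, if you’d rather not remap MIME types, you can get the same effect with the Content-Disposition header. This is a sketch that assumes mod_headers is enabled on your server:

#Force downloads via Content-Disposition (requires mod_headers)
<IfModule mod_headers.c>
<FilesMatch "\.(csv|xls|doc|avi|mpg|mov|pdf)$">
Header set Content-Disposition attachment
</FilesMatch>
</IfModule>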

Deny bad bots
Your public website is constantly being crawled and scraped by bots. Some of these bots are essential – they index your site so it will show in search results. However, there are plenty of bots that aren’t so friendly. Spam bots and scrapers might be bogging down your server, using up bandwidth and resources. We can block bots based on the user-agent string they provide.

The script below denies some bad bots, but isn’t exhaustive. Look to AskApache for resources to help identify more bad bots to block, and use our script as a template if you prefer to add more:


#Block Bad Bots
RewriteCond %{HTTP_USER_AGENT} ^WebBandit [OR]
RewriteCond %{HTTP_USER_AGENT} ^2icommerce [OR]
RewriteCond %{HTTP_USER_AGENT} ^Accoona [OR]
RewriteCond %{HTTP_USER_AGENT} ^ActiveTouristBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^addressendeutshland
RewriteRule ^.* - [F,L]

You can test that it’s working by changing your user-agent in Google Chrome. In the developer tools, go to Settings > Overrides > User agent. Set your user-agent to one of the blocked bots, then visit your site. You should get a 403 Forbidden error.
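
Alternatively, you can send a request with one of the blocked user-agent strings from the command line (for example with curl’s -A option, if curl is available); it should also come back with a 403 response.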

Deny malicious IPs
Nothing slows down a site quite like a server attack. If you know the IP address of a user who is trying to break into or abuse your website, you can deny a specific IP, IP blocks, or domains with .htaccess:


#Deny Malicious IPs
order allow,deny

#deny single IP
deny from 1.1.1.1

#deny IP block
deny from 1.1.1.

allow from all
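
One caveat: order/allow/deny is Apache 2.2 syntax. Apache 2.4 still honors it when mod_access_compat is loaded, but the native 2.4 equivalent (a sketch using the same example IPs) looks like this:

#Deny Malicious IPs (Apache 2.4 Require syntax)
<IfModule mod_authz_core.c>
<RequireAll>
Require all granted
Require not ip 1.1.1.1
Require not ip 1.1.1
</RequireAll>
</IfModule>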

Putting It All Together

Eager to use every tool possible to speed up your site in .htaccess? We put everything together for you here:


<IfModule mod_rewrite.c>
RewriteEngine On

#Prevent Hot Linking
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain\.com [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]

#Block Bad Bots – This is a small list. You can add bots to it.
RewriteCond %{HTTP_USER_AGENT} ^WebBandit [OR]
RewriteCond %{HTTP_USER_AGENT} ^2icommerce [OR]
RewriteCond %{HTTP_USER_AGENT} ^Accoona [OR]
RewriteCond %{HTTP_USER_AGENT} ^ActiveTouristBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^addressendeutshland
RewriteRule ^.* - [F,L]
</IfModule>

# Set up caching on static resources for 1 year based on Google recommendations
<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav|js|css|gif|jpg|jpeg|png|swf)$">
ExpiresDefault A29030400
</FilesMatch>
</IfModule>

# Enable gzip compression
<ifModule mod_gzip.c>
 mod_gzip_on Yes
 mod_gzip_dechunk Yes
 mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
 mod_gzip_item_include handler ^cgi-script$
 mod_gzip_item_include mime ^text/.*
 mod_gzip_item_include mime ^application/x-javascript.*
 mod_gzip_item_exclude mime ^image/.*
 mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
</ifModule>

#Disable Directory Indexes
Options -Indexes

#Force certain types of files to download instead of load in browser
AddType application/octet-stream .csv
AddType application/octet-stream .xls
AddType application/octet-stream .doc
AddType application/octet-stream .avi
AddType application/octet-stream .mpg
AddType application/octet-stream .mov
AddType application/octet-stream .pdf

#Ban Malicious IPs
order allow,deny
deny from 1.1.1.1
allow from all

Bing’s Guide To Quality Content

Following Google’s Panda slap, Bing now reasserts its stand on quality content as well.

When we think of quality content, Google Search is the first thing that comes to mind. However, to reassert its position, Bing’s Duane Forrester has published a blog post giving webmasters some tips and tricks for creating quality content, to ensure that both users and search engines respond to your website.

Unlike Google, which has left webmasters across the globe in the murky waters of reconsideration requests, Bing provides quick, easy-to-follow pointers that should make its crawler conclude that your website has quality content.

Following are the practices Bing suggests you avoid while producing content:

“Duplicate content” – don’t use articles or content that appear in other places. Produce your own unique content.

Thin content – don’t produce pages with little relevant content on them – go deep when producing content – think “authority” when building your pages. Ask yourself if this page of content would be considered an authority on the topic.

All text/All images – work to find a balance here, including images to help explain the content, or using text to fill in details about images on the page. Remember that text held inside an image isn’t readable by the crawlers.

Being lonely – enable ways for visitors to share your content through social media.

Translation tools – rarely does a machine translation tool leave you with content that reads properly and that actually captures the original sentiment. Avoid simply using a tool to translate content from one language to the next and posting that content online.

Skipping proofreading – when you are finished producing content, take the time to check for spelling errors, grammatical mistakes and for the overall flow when reading. Does it sound like you’re repeating words too frequently? Remove them. Don’t be afraid to rewrite the content, either.

Long videos – If you produce video content, keep it easily consumable. Even a short 3 – 4 minute video can be packed with useful content, so running a video out to 20 minutes is poor form in most instances. It increases download times and leads to visitor dissatisfaction at having to wait for the video to load. Plus, if you are adding a transcription of your video, even a short video can produce a lengthy transcription.

Excessively long pages – if your content runs long, move it to a second page. Readers need a break, so be careful here to balance the length of your pages. Make sure your pagination solution doesn’t cause other issues for your search optimization efforts, though.

Content for content’s sake – if you are producing content, be sure its valuable. Don’t just add text to every page to create a deeper page. Be sure the text, images or videos are all relevant to the content of the page.”


When looking to optimize your website, this comprehensive list of “don’ts” seems like a good place to start. However, some skeptics may question the reason behind Bing’s emphasis on quality at this juncture; is this guide an early indication of Bing’s own version of a Panda-like update? Hmmm…

Where to Submit Your XML Sitemap

Sitemaps are an ingredient that completes a website’s SEO package. They are certainly still relevant, since they ensure content is not overlooked by web crawlers and reduce the resource burden on search engines. Sitemaps are a way to “spoon feed” search engines your content to ensure better crawling. Let’s look at how this is done.

XML Format

The sitemap file is what search engines look for. The elements available to an XML sitemap are defined by the sitemap protocol and include urlset, url, loc, lastmod, changefreq, and priority. An example DOM looks like:

    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://example.com/</loc>
        <lastmod>2006-11-18</lastmod>
        <changefreq>daily</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Sitemaps have a 10 MB size limit and cannot have more than 50,000 links, but you can use more than one file for the sitemap. A sitemap that consists of multiple files is called a sitemap index. Sitemap index files have a similar, but different format:

  <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <sitemap>
      <loc>http://www.example.com/sitemap1.xml.gz</loc>
      <lastmod>2004-10-01T18:23:17+00:00</lastmod>
    </sitemap>
    <sitemap>
      <loc>http://www.example.com/sitemap2.xml.gz</loc>
      <lastmod>2005-01-01</lastmod>
    </sitemap>
  </sitemapindex>

There are all kinds of sitemaps: ones for web pages, ones tailored to sites with videos and other media, mobile, geo data, and more. As long as the effort is justified by the SEO benefit, take the time to become familiar with the different types of sitemaps and make one that best fits your website’s architecture.

Location

Sitemaps can be named anything, but the convention is to name the sitemap ‘sitemap.xml’ and place it in the root of the site, i.e. http://example.com/sitemap.xml. If multiple files are needed, they can be named ‘sitemap1.xml’ and ‘sitemap2.xml’. Sitemap files can also be compressed, such as ‘sitemap.gz’. One can also keep sitemaps in subdirectories or submit them for multiple domains, but the cases where that’s needed are very limited.

Submission

Sitemaps are recognized by search engines in three ways:

• Robots.txt
• Ping request
• Submission interface

First, sitemaps can be specified in the robots.txt as follows:
Sitemap: http://example.com/sitemap.xml

The robots.txt file is then placed in the root of the domain, http://example.com/robots.txt, and when crawlers read the file they will find the sitemap and use it to improve their understanding of the website’s layout.
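
For context, a complete (minimal) robots.txt that advertises a sitemap, using the same example.com placeholder, might look like:

User-agent: *
Disallow:
Sitemap: http://example.com/sitemap.xml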

Second, search engines can be notified through “ping” requests, such as:
http://searchengine.com/ping?sitemap=http%3A%2F%2Fwww.yoursite.com%2Fsitemap.xml

These “ping” requests are a standard way for search engines to let websites notify them of updated content. Obviously, the placeholder domain (i.e. “searchengine.com”) should be replaced with the actual engine’s domain, say “google.com”.

Lastly, every major search engine has a submission tool for notifying the engine that a website’s sitemap has changed. Here are four major search engines and their submission URLs:

Google – http://www.google.com/webmasters/tools/ping?sitemap=

Yahoo! – http://search.yahooapis.com/SiteExplorerService/V1/updateNotification?appid=SitemapWriter&url=

Ask.com – http://submissions.ask.com/ping?sitemap=

Bing – http://www.bing.com/webmaster/ping.aspx?siteMap=

The ping requests do not respond with any information besides whether or not the request was received. The submission URLs will respond with information about the sitemap, such as any errors it found.

If your website uses WordPress or the like, there are great plugins such as Google XML Sitemaps which will do all this heavy work for you: creating sitemaps and notifying search engines including Google, Bing, Yahoo, and Ask. There are also tools for creating sitemaps such as the XML-Sitemaps.com tool or Google’s Webmaster Tools.

As we’ve said before, making sitemaps “shouldn’t take precedence over good internal linking, inbound link acquisition, a proper title structure, or content that makes your site a resource and not just a list of pages.” However, taking just a little bit of time with a good tool will help you complete your SEO package with a sitemap. Take this tutorial and make your site known!

Apple Enables 5G Updates for iPadOS

Apple has enabled iPadOS updates over 5G, giving users the opportunity to update their iPads using their wireless data.

In the early days of iOS, Apple did not allow users to download OS updates via their wireless plans. Instead, OS updates required a WiFi connection. As unlimited plans became the norm, Apple changed their stance, allowing OS updates over 4G LTE.

With the iPhone 12, Apple expanded wireless OS downloads to include 5G as well. Now the company has rolled out the feature to the latest 12.9-inch and 11-inch iPad Pros, both of which support 5G.

Apple currently has three different 5G data modes: Allow More Data on 5G, Standard and Low Data Mode. To update over 5G, users will need to enable the More Data mode.

Allow More Data on 5G: Enables higher data-usage features for apps and system tasks. These include higher-quality FaceTime, high-definition content on Apple TV, Apple Music songs and videos, and iPadOS updates over cellular. This setting also allows third-party apps to use more cellular data for enhanced experiences. This is the default setting with some unlimited-data plans, depending on your carrier. This setting uses more cellular data.

Given the high speeds 5G offers, 5G OS updates are a welcome addition to the new iPad Pros.

Netflix May Be Moving Into Gaming

One of the biggest streaming platforms may be making a move into gaming, as Netflix looks for an executive to lead the effort.

Netflix is one of the most successful streaming platforms, with more than 207 million subscribers. As the company continues to look for ways to stay competitive, gaming is a logical area for possible expansion.

According to a report by The Information, Netflix is currently looking for an executive who could head up its gaming initiative. The company is reportedly looking to create a service similar to Apple Arcade, one that will not be ad-supported.

The company all but confirmed its plans in a comment to GameSpot:

“Our members value the variety and quality of our content. It’s why we’ve continually expanded our offering–from series to documentaries, film, local language originals and reality TV,” Netflix told GameSpot. “Members also enjoy engaging more directly with stories they love–through interactive shows like Bandersnatch and You v. Wild, or games based on Stranger Things, La Casa de Papel and To All the Boys. So we’re excited to do more with interactive entertainment.”

Should Netflix’s plans prove successful, it would open an entirely new opportunity for the company, ensuring growth for years to come.

Twitter Pulls Its Auto Cropping Algorithm Amid Bias Issues

Twitter has announced it is pulling its algorithm responsible for automatically cropping images amid bias issues.

Twitter began hearing feedback in October 2020 that there were issues with how the algorithm was functioning, that it was not treating everyone equitably. The company investigated and did find issues with it.

Testing showed there was an 8% difference from demographic parity favoring women over men. Likewise, there was a 4% difference favoring white individuals over Black individuals, a 7% difference favoring white women over Black women, and a 2% difference favoring white men over Black men.

One area where the algorithm did not appear biased was in the realm of the “male gaze.”

We also tested for the “male gaze” by randomly selecting 100 male- and female-presenting images that had more than one area in the image identified by the algorithm as salient and observing how our model chose to crop the image. We didn’t find evidence of objectification bias — in other words, our algorithm did not crop images of men or women on areas other than their faces at a significant rate

Ultimately, however, the biases were enough to make Twitter reevaluate use of the algorithm.

We considered the tradeoffs between the speed and consistency of automated cropping with the potential risks we saw in this research. One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.

TikTok Tackling Cyberbullying With Mass Comment Deletion

TikTok is taking steps to combat cyberbullying by giving users the ability to mass-delete comments.

TikTok has quickly skyrocketed in popularity with users around the world, becoming one of the most popular social media platforms. Unfortunately, as with all social media, cyberbullying can be a major problem. TikTok has been working to combat bullying, giving users the tools they need to fight back.

The most recent feature being rolled out is the ability to mass-delete comments and block accounts. The feature is particularly useful given the current social media climate, where large numbers of individuals can quickly gang up on a single user, overwhelming their account with negative comments.

To manage interactions on a video, people can long-press on a comment or tap the pencil icon in the upper left corner to open a window of options. From there, people can now select up to 100 comments or accounts rather than having to go one by one, making it more seamless to delete or report multiple comments or block users in bulk.

TikTok says the feature is rolling out in select markets, with global rollout happening over the next several weeks.

CEO of TikTok’s Parent Company Stepping Down

The founder of TikTok’s parent, ByteDance, is stepping down as CEO amid some of its biggest challenges.

TikTok drew the ire of the Trump administration over privacy and security concerns. The company has had a number of major missteps, including being accused of sending data to China, violating child privacy and running afoul of EU privacy laws. The Trump administration approved a ban against the platform in an effort to force ByteDance to sell TikTok to an American company. Oracle and Walmart eventually emerged as the winning candidates.

Ultimately, the ban and forced sale were tied up in court. The change in administrations added to the chaos, with the Biden administration wanting to evaluate whether a ban was warranted.

With the platform’s future still in question, ABC News is reporting that founder Zhang Yiming is stepping down, to be replaced by Liang Rubo, another co-founder. Zhang said the change would “enable me to have greater impact on longer-term initiatives,” although he did not detail what those initiatives may be.

Google Will Open First Store in NYC

Taking a page from Apple and Microsoft, Google plans to open its first store in NYC.

Apple’s retail stores have been a big part of the company’s success, becoming some of the most valuable stores in retail, per square foot. Although never achieving the same success, Microsoft’s stores were a familiar sight in many shopping malls around the country before the company closed the vast majority of them.

Google hopes to strike gold with its own retail store strategy, the first of which will be opened in NYC, in Chelsea. Google’s description of its store sounds very similar to an Apple Store.

The company made the announcement on its official blog.

At the Google Store, customers will be able to browse and buy an extensive selection of products made by Google, ranging from Pixel phones to Nest products, Fitbit devices to Pixelbooks and more. Or they can shop online at GoogleStore.com and pick up their orders in store. Throughout the store, visitors will be able to experience how our products and services work together in a variety of immersive ways, which we’re excited to share more about when the doors open.

We’ll have experts on hand to help visitors get the most out of their device, such as troubleshooting an issue, fixing a cracked Pixel screen or helping with installations. It doesn’t matter whether you’re a longtime Pixel user, are curious about our Nest displays or want to participate in one of the how-to workshops we’ll offer throughout the year — our team will be able to provide you with help that’s specific and personalized to your needs. 

Should the Chelsea location prove successful, it’s a safe bet the company will likely expand its retail footprint. In the meantime, the Chelsea location will be open summer 2021.