29 March 2017 @ 04:23 pm
UptimeRobot vs SoftLayer  
We host 2 PostJobFree servers on SoftLayer (in their Dallas datacenter).
In the last year I have been getting more and more warning signs that SoftLayer is slowly decaying (after its acquisition by IBM 3 years ago).
So, finally, I decided to check how good the uptime of https://www.softlayer.com/ actually is.

So I created a new "Keyword" monitor on https://uptimerobot.com
The monitor checks whether the words "Data Centers" were rendered into SoftLayer's home page HTML.
UptimeRobot runs that check every minute.
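
Roughly, a keyword check like that boils down to the following (a minimal Python sketch using only the standard library; UptimeRobot's actual implementation is, of course, not public):
---
# Fetch a page and report whether a keyword appears in its HTML.
import urllib.request

def keyword_is_up(url, keyword):
    try:
        with urllib.request.urlopen(url, timeout=30) as response:
            html = response.read().decode("utf-8", errors="replace")
            return keyword in html
    except Exception:
        # Network errors and non-2xx statuses both count as "down".
        return False

print(keyword_is_up("https://www.softlayer.com/", "Data Centers"))
---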

So, how much uptime is the legendary hosting provider able to keep for its own web site?
According to UptimeRobot, SoftLayer's home page uptime is a pathetic 99%.
That means there is a 1% chance that the SoftLayer home page is down at any given moment.
Among hosting providers, uptime below 99.9% is considered poor, and uptime above 99.99% is considered good.
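
To translate those percentages into concrete downtime (a quick arithmetic sketch in Python):
---
# Downtime per year implied by a given uptime percentage.
for uptime in (99.0, 99.9, 99.99):
    hours_down = (100 - uptime) / 100 * 365 * 24
    print(f"{uptime}% uptime -> {hours_down:.1f} hours of downtime per year")

# 99.0%  -> 87.6 hours (~3.7 days) per year
# 99.9%  -> 8.8 hours per year
# 99.99% -> 0.9 hours (~53 minutes) per year
---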

According to UptimeRobot, when SoftLayer's home page is up, its average response time is 681.72ms (about 0.7 seconds, which is more or less OK).

To put things in perspective: the PostJobFree home page (which is hosted on a dedicated server in SoftLayer) has 100% uptime (99.99%+) and a 139ms average response time.



For now our dedicated servers on SoftLayer still work, but if the SoftLayer tech team keeps deteriorating, they will eventually mess up their core network too, and that would bring downtime to our servers as well.

So I am looking for a new hosting provider now.
Would you recommend any?

Originally posted at: http://dennisgorelik.dreamwidth.org/128320.html
 
 
 
Сисадмин-любитель (ulrith) on March 29th, 2017 09:25 pm (UTC)
Why are you so sure that UptimeRobot is accurate? Did you compare monitoring results with some other sites?

What if there is some network problem in between them?

And they may not want to have a 99.99% SLA for their home page at all, because they don't sell it. As you said, the server they actually sell has an appropriate uptime level.
Dennis Gorelik (dennisgorelik) on March 29th, 2017 10:58 pm (UTC)
> Why are you so sure that UptimeRobot is accurate?

Because it matches with my own manual checks.

> Did you compare monitoring results with some other sites?

Yes, I did.
Other web sites have almost 100% uptime.

> What if there is some network problem in between them?

Network problems between websites?
The Internet is a web. There are many ways web sites can connect with each other, right?

> And they may not want to have a 99.99% SLA for their home page

Not "SLA", but keep it a goal for themselves. To prove that they understand what they are doing.

> the server they actually sell has an appropriate uptime level

It was not the current team who sold us that server.
It was the SoftLayer team many years ago, before IBM bought them.
Yes, they are still able to keep that server working, but I am afraid that in a few more years that will no longer be the case.
Besides, I will eventually need to upgrade, and an upgrade is much more sensitive to technical incompetence than maintenance of an already working server.
Yaturkenzhensirhiv - a handheld spy (yatur) on March 30th, 2017 04:17 am (UTC)
> The Internet is a web. There are many ways web sites can connect with each other, right?

Not that many, and usually only one is actually used at any particular time.

https://en.wikipedia.org/wiki/Routing_protocol

It takes time to detect a failure and switch to an alternate route (if it even exists).
Dennis Gorelik (dennisgorelik) on March 30th, 2017 05:59 am (UTC)
We are talking about alternative network links between large hubs, not between remote servers on the edges of the Internet, right?

We are talking about links specifically between hubs, because UptimeRobot gets successful responses from other web sites. That means UptimeRobot has no problem reaching the big Internet hubs (through which it reaches those other web sites).

But there is another reason why I know for sure that SoftLayer's web server specifically was down -- it returned:
---
http://www.softlayer.com/
Http/1.1 Service Unavailable
---
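
For what it's worth, that status line can be reproduced with a small check (a Python sketch; exactly how UptimeRobot renders the "HTTP/1.1 ..." line is an assumption):
---
# Report the HTTP status line a server returns for a URL.
import urllib.request, urllib.error

def status_line(url):
    try:
        with urllib.request.urlopen(url, timeout=30) as response:
            return f"HTTP/1.1 {response.status} {response.reason}"
    except urllib.error.HTTPError as error:
        # Non-2xx responses (e.g. 503 Service Unavailable) land here.
        return f"HTTP/1.1 {error.code} {error.reason}"

print(status_line("http://www.softlayer.com/"))
---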
Сисадмин-любитель (ulrith) on March 30th, 2017 04:45 am (UTC)
If we are talking about an upgrade, I would recommend migrating to the cloud (with proper code refactoring). Any hosted solution looks outdated now...
Dennis Gorelik (dennisgorelik) on March 30th, 2017 05:49 am (UTC)
What's wrong with dedicated servers?
Сисадмин-любитель (ulrith) on March 30th, 2017 06:19 am (UTC)
As you told us in this post: you're afraid the company is sinking; you're afraid they will be unable to do an upgrade safely; you're afraid they will do something to your server; you're afraid they will have network issues in the datacenter.

And to get a general idea of why clouds are better, please consult Google :)
Dennis Gorelik (dennisgorelik) on March 30th, 2017 07:17 am (UTC)
With cloud hosting all of these issues would be even worse: downtime would be much more likely.
In fact, the downtime of cloud solutions on SoftLayer is much worse than the downtime of dedicated servers.
I think softlayer.com itself is hosted on their cloud servers.

The performance of cloud (VPS) solutions is also a little worse than the performance of dedicated servers.

What do you think are the compelling reasons to switch to cloud hosting?
Сисадмин-любитель (ulrith) on March 30th, 2017 07:37 am (UTC)
I believe you're completely missing the point. It makes no sense to compare the "downtime" of a dedicated server and of a cloud. Actually, the downtime problem does not exist for cloud applications at all. Or we can say that downtime is there by design. :)

Of course, the apps need to be real cloud applications, not just a virtual machine moved from a dedicated server to a cloud, which makes absolutely no sense.

It's impossible to explain the cloud app development approach here, but it changes the way you think about your service. See this link to get an idea of what I'm talking about: https://12factor.net/

P.S. You already use the cloud service Amazon S3, don't you? So you're actually already in the cloud, but you're obviously cooking it the wrong way...
Dennis Gorelik (dennisgorelik) on March 30th, 2017 08:50 am (UTC)
> Actually, the downtime problem does not exist for cloud applications at all.

Do you mean there is only the problem of accessing cloud applications from time to time?
:-)

---
https://12factor.net/config
Another approach to config is the use of config files which are not checked into revision control
...
The twelve-factor app stores config in environment variables
---
So "twelve-factor" is suggesting NOT to use source control for configuration code?
It is a terrible omission: no code backup, no code review, no history of changes.

Where do you keep configurations for different environments?
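
For context, the twelve-factor style looks roughly like this (a Python sketch; the variable names are hypothetical examples, not anything from PostJobFree):
---
# Twelve-factor style config: the application reads its settings from
# environment variables, and each deploy (dev, staging, production)
# sets its own values before starting the process.
import os

DATABASE_URL = os.environ["DATABASE_URL"]             # required: fail early if missing
SMTP_HOST = os.environ.get("SMTP_HOST", "localhost")  # optional, with a default

print(f"db: {DATABASE_URL}, mail via: {SMTP_HOST}")
---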

> You already use cloud service Amazon S3, don't you?

Nope.
Why should we?
It is an extra dependency that we would have to manage.
What do I do if Amazon S3 is not available?
It also adds delays (relative to storing data in a local database).

We use Amazon SES though.

Edited at 2017-03-30 08:50 am (UTC)
Сисадмин-любитель (ulrith) on March 30th, 2017 09:14 am (UTC)
You should educate yourself, not argue. But it's up to you, of course :)
Dennis Gorelik (dennisgorelik) on March 30th, 2017 09:19 am (UTC)
> You should educate yourself

That's exactly what I am doing.

> not argue

Arguing is a vital part of education.
So you are contradicting yourself.

Why did you skip my questions?
Do you not have a good answer?
Or is the topic not interesting?