Cloud Integration: What is the Best Strategy  

By Chris Newell
Founder & President

Many of our clients look to my team for guidance on cloud services, applications, and transport. Common questions include: “What should be in the cloud? How does it integrate? How do I get there?” The most common question of all is, “How can we control costs?” 

Cloud services and applications that were once bleeding or cutting edge have become mainstream and I am not just talking about cloud compute. Seemingly everything in the communication and compute world has a cloud connection or application.  

Most services and applications have all or some aspect of their product in the cloud. Flash back 10 years, and services and applications were primarily kept within their own technology and physical location. Today everything is integrated or has APIs. This creates a strategy issue for companies and organizations. For example, some are wondering whether MS O365 integrates with their voice platform, or whether they should look at options that would accommodate that, such as upgrading licenses.

Employees may favor Google Drive or Box over MS One Drive and companies question which to standardize on. Recently, a client asked me how Nextiva, Calabrio and Five9 would integrate if they were utilized within the same organization. Most of these technologies mentioned are intertwined. However, choosing the overall correct technology path for an organization can be tricky and potentially costly.  

We have been told for years that cloud services and applications can connect to each other through open APIs, but this strategy has its flaws. Having an open API is one thing; building a working model off that API is another. For example, there is a reason why CCaaS providers like inContact, Five9, and Genesys now own, have developed, or have merged with WFO/WFM providers.  

The data flow and integration of an API was not meeting client requirements, and the merger of these two technologies was inevitable. Integration between technologies must have a proven track record. Simply saying there is an open API does not mean anything has been built, tested, and proven to be a solid integration. Companies also should not rely solely on APIs being available long term.  

Multiple times, we have seen competing companies purchase supporting technologies and discontinue or limit API’s to their competition, thereby leaving businesses stranded. It is extremely important to roadmap out dependencies and how integration takes place.  

Connecting to cloud applications and services is a basic concept: you just need an Internet or private connection. While this is important, having Internet redundancy is vital for a cloud strategy to work. Redundancy with seamless (packetized) fail-over is required and should run over separate media. For example, if your first Internet connection is over fiber, it is important to use coax, microwave, or 4G/5G as your second medium and connection.  

In a perfect world, a second fiber provider coming into the location through a separate entrance facility would be optimal. Internet transport connections need to be configured in a way where there is packetized fail-over between the connections. This is traditionally accomplished with SD WAN technology.  
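The fail-over idea above can be sketched in a few lines: prefer the primary link while it is healthy, otherwise shift traffic to the secondary medium. Real SD WAN appliances do this per-packet or per-flow using live path metrics; the link names and the simple health flag here are illustrative assumptions, not any vendor's API.

```python
# Minimal sketch of priority-based link fail-over. The "healthy" flag
# stands in for the live path probes a real SD WAN appliance would run.
links = [
    {"name": "fiber (primary)",   "healthy": True},
    {"name": "4G/5G (secondary)", "healthy": True},
]

def active_link(links):
    """Return the name of the first healthy link in priority order."""
    for link in links:
        if link["healthy"]:
            return link["name"]
    return None  # total outage: no healthy path

print(active_link(links))    # fiber (primary)
links[0]["healthy"] = False  # simulate the fiber path failing
print(active_link(links))    # 4G/5G (secondary)
```

The point of separate media is that the two entries in this list should not share a failure mode: a backhoe that cuts the fiber should not also take down the 4G/5G path.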

Controlling costs and cloud sprawl can be difficult, as cloud services and applications are largely dependent on each other. Options include engaging a third-party provider to oversee workload costs or using a Technology Expense Management (TEM) provider.  

No matter what your strategy is for cloud services and applications, spend time determining dependencies and how your overall organization can control the environment. Senior leadership will appreciate your due diligence. 

Cutting the Cost of Copper Lines 

By Chris Newell
Founder & President

Dating back to Alexander Graham Bell, “Copper-Line” facilities are one of the oldest technologies in our industry.  This type of technology can go by many names (i.e., 1FB, 1MB, POTS, Copper Lines, Copper Pairs, Phone Line, etc.).  Although most of us have moved to other technologies for our day-to-day voice service, the fact remains that copper-line services continue to be used in many ways throughout the US.  

Every month, businesses approve charges associated with copper lines without fully knowing what they’re paying for, whether they are being charged correctly, or whether they are receiving a fair rate. These copper services are typically connected to small offices, elevators, alarms, faxes, security systems, and emergency phones utilizing ring-down functionality.  

The average cost of a copper line is approximately $50 per month, but many businesses are noticing that phone companies are increasing their fees on these types of services.  RBOCs are drastically increasing their copper-line facility costs in an effort to force organizations off the service and decommission copper facilities.   

Some businesses spend tens of thousands, even millions, per year on these facilities.  $50+ by itself may not seem like a large cost; however, when you consider businesses with multiple locations that are being charged for 3+ lines per location, these costs can quickly add up and potentially impact budgets.  
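To make the math concrete, here is a quick back-of-the-envelope calculation. The location count and line count are hypothetical; only the roughly $50/month rate comes from the discussion above.

```python
# Illustrative only: the location and line counts are assumptions,
# not figures from any specific business.
locations = 100
lines_per_location = 3
monthly_rate = 50  # dollars per copper line, per the average cited above

monthly_cost = locations * lines_per_location * monthly_rate
annual_cost = monthly_cost * 12
print(f"${monthly_cost:,}/month -> ${annual_cost:,}/year")
# $15,000/month -> $180,000/year
```

At that scale, even a modest rate increase per line moves the annual total by tens of thousands of dollars.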

It’s very common for a business to install lines simply because of poorly maintained inventories or lack of information detailing the location of existing facilities.  Few businesses want to pay someone to tag and locate $50 copper lines.  To avoid the headache of disconnecting existing services, these copper lines are left in place and new services are ordered.  

New technology now takes the copper facilities out of play.  Called “POTS in a Box,” it delivers VoIP or SIP service over 4G/5G or broadband to an analog converter connecting to the 66 block.  This bypasses the copper facilities and reduces cost significantly.   

One of the many ways to identify and eliminate lines that aren’t needed is to isolate and then disconnect lines with zero usage.  The best way to determine zero usage is to accumulate all invoices from each location with copper lines, which are usually provided by your RBOC, such as AT&T, Verizon, Windstream, or CenturyLink. The intent is to create an inventory of each location with associated numbers, usage, and costs.  Once an inventory is created and costs compiled, cost savings and potential line reduction are possible by working with an aggregator or negotiating a volume line agreement with the RBOC.  
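The inventory-and-isolate step above amounts to a simple filter over invoice records. This sketch assumes hypothetical field names and sample numbers; real data would be transcribed from the RBOC invoices.

```python
# Hypothetical invoice records built from monthly carrier bills.
# Field names and values are assumptions for illustration.
invoices = [
    {"location": "HQ",       "number": "555-0101", "minutes_used": 342, "monthly_cost": 52.00},
    {"location": "HQ",       "number": "555-0102", "minutes_used": 0,   "monthly_cost": 48.50},
    {"location": "Branch 1", "number": "555-0201", "minutes_used": 0,   "monthly_cost": 55.00},
]

# Isolate zero-usage lines: the candidates for disconnection.
zero_usage = [line for line in invoices if line["minutes_used"] == 0]
potential_savings = sum(line["monthly_cost"] for line in zero_usage)

for line in zero_usage:
    print(f'{line["location"]} {line["number"]}: ${line["monthly_cost"]:.2f}/mo')
print(f"Potential monthly savings: ${potential_savings:.2f}")
```

Before disconnecting, each zero-usage line still needs a sanity check: an alarm or elevator line may show no billed usage yet be essential.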

We can all agree that communicating and negotiating with the carriers can be frustrating at times; thankfully, GCG has the resources to make this process less painful.  You likely won’t be able to avoid the need for copper lines at your place of business, but you can make substantial reductions to your monthly costs and eliminate a lot of the unnecessary frustration associated with these types of services. 

Data Center Energy: fear the future?

By Chris Newell
Founder & President

We are all aware of the fossil fuel and energy dependency of the data center realm. The real problem is energy and power: the exponential growth of big data analysis, cryptocurrency, streaming video, Artificial Intelligence (AI), and the Internet of Things (IoT) is gobbling up every piece of energy available. 

The demand for electricity to power our data processors and streaming services is growing bigger every day, but the supply is not rising to match it. How will the global economy quench the thirst of its own creations in the years to come? 

First Steps 

Initial efforts to curb the massive power surge include companies taking older hardware offline and rerouting those services to more efficient cloud services and server virtualization solutions. Replacing hard disk drives with flash storage units improves both energy use and physical storage; flash units need power only when they are actively in use and are completely dormant the rest of the time.  

Of course, while these maneuvers are more energy efficient, they are not free. Plenty of companies would rather continue gobbling up electricity than replace their entire process. Short of government mandates, it is difficult to imagine many businesses willingly scrapping their legacy systems and equipment, buying access to new virtual replacements, and also paying for all of their employees to get comfortable with the new services. 

The other big push is in the acronym PUE, which stands for power usage effectiveness. In layman’s terms, it is the ratio of the total power required to run a data center versus the specific power involved in the compute and storage operations in that facility.  

A PUE rating of 1.0 means that the data center is 100% efficient. While most big data operators like cloud companies claim their average PUE numbers are between 1.1 and 1.4, some industry watchdogs question those figures (https://datacentremagazine.com/critical-environments/pushing-limits-data-centre-efficiency).  
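The ratio described above can be expressed as a one-line calculation. The kW figures below are illustrative, chosen to show what a claimed 1.4 rating actually means.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by the
    power consumed by the compute/storage (IT) equipment alone."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,400 kW overall while its IT gear draws 1,000 kW:
print(round(pue(1400, 1000), 2))  # 1.4
```

A PUE of 1.4 means every watt of compute requires another 0.4 W of overhead for cooling, power conversion, lighting, and the rest of the facility; 1.0 would mean zero overhead.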

While some exaggerated claims have been made about how much pollution these facilities give off, official statistics on data center energy use are not compiled at either the national or global level. That said, considering how rapidly energy is being consumed by data centers and processors, it is undeniable that there is an impact on the amount of pollution generated. 

What’s Next? 

There is no slowing down the industry. Companies are not going to suddenly regress in terms of how much processing power they devote to AI, IoT, and machine learning (ML). Without incentives or regulations, the transition to renewable, green energy sources is moving somewhere between the pace of a glacier and a snail, with 80% of the world’s energy still derived from traditional fossil fuels. Short of self-imposed corporate mandates like those from companies such as Apple, there is a need to find a better way. 

Here are two possible ways forward: a massive investment in building more data centers to handle the load while watching fuel costs rise to a tipping point, or the power of human innovation to offer something new. The latter is clearly the path to a better tomorrow. Considering how far we have come in such a short time, the next solution is out there somewhere. 

EA or CSP? Good Question 

By Chris Newell
Founder & President

Traditional software licenses or head for the cloud? Is there a different way to procure cloud software/SaaS? These are some questions our clients ask as they look to leverage costs against their projected growth in the short, mid, and long-term. 

For firms using Microsoft products – which at last count is in the neighborhood of 650,000 including a whopping 91% of the Fortune 100 – this decision comes down to choosing between Microsoft’s Enterprise Agreement (EA) and its Cloud Solution Provider (CSP). Deciding which one is best for your business situation depends largely on who you are now and who you want to be in the future. 

The Case for CSP 

CSP flexibility has become a huge part of many business plans because it allows them to manage the ebb and flow of a volatile world with a lot less risk. Instead of the typical three-year agreement that is the foundation of the EA, CSP offers a more flexible purchasing cycle, which not only gives businesses more elasticity in their budgets but also means they pay only for the licenses they use each month.  

For companies where there is a dedicated seasonal surge, this is a great way to upgrade the number of users needed for that specific window without having to take a bath on their price the rest of the year. Depending on the number of licenses purchased, CSP can have similar discounts as the EA, making the CSP decision easy from a financial and contractual perspective.  
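A quick comparison shows why monthly flexibility can beat a lower committed rate for seasonal businesses. All prices, user counts, and the surge length below are hypothetical assumptions, not Microsoft pricing.

```python
# Hypothetical comparison: fixed annual EA commitment vs. month-to-month
# CSP licensing for a business with a 3-month seasonal surge.
ea_price = 30   # $/license/month, committed for the full year (assumed)
csp_price = 33  # $/license/month, adjustable monthly (assumed)

baseline_users = 400
peak_users = 600   # needed only during the seasonal surge
peak_months = 3

# EA: must license for the peak headcount all year.
ea_annual = peak_users * ea_price * 12

# CSP: pay for peak headcount only during the surge.
csp_annual = (peak_users * csp_price * peak_months
              + baseline_users * csp_price * (12 - peak_months))

print(f"EA:  ${ea_annual:,}")   # EA:  $216,000
print(f"CSP: ${csp_annual:,}")  # CSP: $178,200
```

Even with a higher per-license price, CSP comes out ahead here because the business sheds 200 licenses for nine months of the year; with a flat headcount year-round, the EA's committed rate would win instead.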

In addition to the flexibility and license cost that the CSP provides, there are many providers that bake in Advanced or Premier Support, which eliminates the need to purchase a Premier Support contract, as required in the EA structure. Some CSP providers have their support onshore with 24x7x365 access, which is a nice change from the existing MS support model.  

The Case for EA 

The EA is the primary licensing vehicle Microsoft has been using. It is for companies with at least 500 licensed seats that want software along with cloud service options, typically for a minimum of three years. It is a big package deal that naturally affords the customer some discounts due to the volume: the more licenses you buy, the less you pay per license, especially if you can negotiate a better price. As the business scales and needs to add or remove hardware, online services, or connected devices, a once-a-year process called the “true-up” handles the changes without having to order them individually.  

There is a bit of measured risk with the EA, however. If the expected number of users does not meet what you’ve purchased during the first year, there’s no getting your money back. Most businesses are confident of their growth predictions before making an investment like this; but what happens in the event of something unforeseen and catastrophic like COVID-19 or a natural disaster that stunts or cripples your growth?  

Obviously, risk management is part of any business, but the limitations should be noted. If Azure is part of your company’s package, it is possible to reduce the number of licenses needed without cost starting in the second year, but that does not apply to the rest of Microsoft’s products. 

Conclusion 

Kudos to Microsoft for recognizing how differently two businesses can approach purchasing licenses and support. The choice, then, is between the long-term EA at a sizable discount and the month-to-month CSP with similar per-license discounts but a lot more flexibility and seemingly better support. 

The evolution that brought us here: how communications have changed in 60 years 

By John Witcher
Director of Client Engagement

The landscape of office communications is changing at a rapid pace. The days of buying a PBX or key system are fading; in their place are platforms that allow for communication in several ways, all tied into a single source.  

Many businesses are moving away from old, technologically disparate communication solutions to Unified Communications as a Service. This is enabling the workforce to be more productive and connected than ever.  

Communication platforms have evolved over time and have accelerated greatly. Here are some of those highlights: 

1960 – 1990 

  • Computer Networking and communications between computers became possible 
  • Email became a new way to communicate  
  • Cell phones were introduced (remember those brick-like cell phones we used to carry around in bags?) 
  • Large PBX’s were developed (if you wanted redundancy, just buy two) 
  • Audio conferencing became more widely used during this time period, and we all learned (or thought we did) how to mute our phones  

1990 – 2000

  • Instant messaging was developed and grew to be one of the most prominent ways to communicate among businesses, friends, and family around the world 
  • Video conferencing was in its infancy during this period; it would explode from the 2000s to the present 
  • The first smart device was made, (who remembers IBM’s Simon?)  
  • Firewalls became a vital part of businesses securing their network environments  
  • Texting was released, (remember 15 cents per text?)  
  • Online conferencing and web share were launched 
  • Cordless phones were released (no more 20-foot-long, tangled cords!) 

2000 – 2010  

  • Software as a Service was born, which led the way to productized cloud platforms, including Unified Communications (UCaaS), Contact Center (CCaaS), Infrastructure (IaaS), etc.  
  • The first iPhone was released 
  • Web conferencing was launched, allowing businesses to hold live meetings across the web (remember the first jittery meetings like this?) 
  • Skype was launched  
  • MPLS was pushed to the client edge, allowing for stronger traffic shaping, and quality of service  
  • Social Media platforms became prominent, and businesses started using these platforms for marketing and business growth  

2010 – 2015  

  • UCaaS was further developed with the increase of cloud adoption  
  • Cloud file sharing (such as Google Drive) and collaboration were launched 
  • O365 was released and Skype was purchased by Microsoft 
  • Providers fine-tuned the online conference by combining HD video, online business meetings, webinars, and mobile capabilities into a single collaborative solution without expensive video conferencing infrastructure 
  • Asynchronous communication for businesses (SMS Text and IM) is more prevalent  
  • SD WAN was productized and released to the mainstream  

2015 – Present 

Most of the services and forms of communication listed above are now consolidated into one single provider or have seamless integration with many other applications. This has enabled users to consume communications on multiple platforms, shift easily between them, and create increased collaboration and productivity across business segments.  

Asynchronous communication is the norm where you can work on multiple projects at once, while waiting on a response from others.  Voice calls are mostly collaboration discussions and extensions of a digital interaction.   

Considering how quickly we have moved from the first computer networking to today’s version of interconnected communications, can you imagine where we will be in the next 10 years?  

Special thanks to John Witcher, who assisted in writing and editing this blog post.