Technology


Windows 7 to Windows 10: Migration Best Practices



So, you’re thinking of migrating to Windows 10 before the Windows 7 end of life cut-off date. Your operating system isn’t something you often ponder, but letting go of Windows 7 has proven a difficult step for a lot of users and, let’s face it, for you too. With extended support ending in January 2020, though, it’s no longer something that organisations can ignore; the longer the migration is left, the more stressful it will be. It’s also important to realise that the times are changing: Windows 10 isn’t a traditional migration by any means. Microsoft has labelled it the ‘final’ OS, rethinking the old cycle of a new version every three years. This new ‘evergreen’ approach removes the need to constantly build something better and new, updating automatically twice a year, indefinitely, so that you don’t need to think about it.

While organisations can still enjoy the security of extended support for a little while longer, it is imperative that a migration to Windows 10 is completed before the deadline. Forgoing the upgrade will leave you running an unsupported, insecure operating system: Microsoft will no longer offer technical support, software updates, security updates or fixes. Your organisation will be at greater risk from viruses and malware, exposed not only to significant fines but to cyber criminals exploiting the lapse. But why migrate to Windows 10 specifically? Aside from the evergreen operating system, Microsoft has pledged that organisations adopting Windows 10 are unlikely to face compatibility issues. To help you embrace the new possibilities of Windows 10, here are the best practices to make your migration as smooth as possible.

This is a transformation, not a migration
Windows 10 is unique among Windows operating systems in that it gives organisations an opportunity to rethink how they do Windows management, using new modern management features. These offer IT departments the chance to manage PCs in a way similar to mobile devices, which is significant because it lets them manage all end-user computing devices, regardless of operating system, with the same set of tools. Modern management also allows devices to be managed anywhere and at any time, even when they’re off the domain, and it’s lighter-weight and easier to run overall.

Pick the right version of Windows 10
With the new version of Windows, Microsoft has made three versions available for customers to choose from.
1. The Windows Insider Program (WIP) offers users the opportunity to be an early adopter of the latest features that will eventually be incorporated into the mainstream version. It’s a way for users to get a sneak peek at what’s in store.
2. The Long-Term Servicing Channel (LTSC) is optimal for users with devices that do not change and are fixed in function, such as point-of-sale (POS), kiosks, bank teller devices and PCs attached to manufacturing or healthcare devices. This version is exclusive to organisations and is not intended for mainstream PCs.
3. The most common version deployed is the Semi-Annual Channel (SAC). It targets mainstream business PCs and is designed for the most common scenarios. Each SAC release is serviced for 18 months, with an initial targeted (pilot) stage of around three months.

Getting the right team together
The vast majority of organisations have successfully completed Windows migrations in the past. A Windows 10 migration is slightly different, because of its potential impact on a broader audience, so it will require a strong cross-team effort to achieve the desired results. Your team should be made up of a project manager, a technical lead, representation from application development, and the user business units, so that their interests are included. To make sure the migration runs smoothly, the team should be committed at least part-time for three to six months (or even longer), depending on the size of your organisation, the complexity of the project and your priorities.

Use standardisation to reduce complexity
PC computing can become fairly complex: the variables of device types, application updates and user-driven changes constantly alter the makeup of what should be a standard configuration. Migrating to Windows 10 is the best time to eliminate any unnecessary configurations that add to the complexity. Make the most of it for your IT team by removing needless applications, reducing the number of device types and minimising the variability of user configurations.

Consider different approaches to your Windows migration
There are several ways that you can handle a Windows migration.
PC refresh
This is the first choice for new PCs, since there’s no legacy technology to worry about. It can, however, cost a bit more, as the OEM image often includes bloatware and is generally incomplete for most users.
In-place upgrade
These are popular for Windows 10, since Microsoft has made the upgrade process far simpler and easier to manage. Just remember that legacy application compatibility issues and less-than-ideal configurations get carried over as part of the process.
Re-imaging
Extending the life of existing PC assets, re-imaging resets the machine to a known-good image that has to be tested and vetted properly. It can, however, be expensive, as new images need to be created for existing PCs, and the process can take several weeks.
Virtual desktop infrastructure (VDI)
For the last option, VDI allows a high degree of standardisation, delivered securely from a centralised infrastructure. VDI migrations are ideal for organisations whose users have identical application needs, such as call centres or teams of remote agents. A slight downside: VDI does require infrastructure, which some customers find challenging.

Embrace unified endpoint management
Possibly the most significant opportunity to arise from the Windows 7 end of life is the chance to adopt a modern IT management style, by leveraging unified endpoint management, that benefits your users as well as your organisation. It provides numerous benefits across physical devices while enhancing security, through modern configuration management of user policies, application deployment and OS patching. This approach allows organisations to manage Windows with the same skills being used today on mobile, while unifying activities across all end-user computing (EUC) environments.

Sounds good? Here are the minimum hardware requirements to run Windows 10 smoothly: a 1GHz processor, 1GB of RAM for the 32-bit version (2GB for 64-bit), 16GB of hard disk space (20GB for 64-bit), a Microsoft DirectX 9 graphics device with a WDDM driver and, obviously, a Microsoft account and internet access. They’re basically the same as for Windows 7, but with a processor that supports PAE, NX and SSE2.
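If you want to sanity-check an estate before scheduling upgrades, a quick inventory script can flag machines that fall short. Below is a minimal Python sketch, assuming the third-party psutil package is available; the thresholds simply mirror the figures above, and a real readiness assessment (or Microsoft’s own tooling) would check far more.

```python
# Rough pre-flight check against the Windows 10 minimums quoted above.
# Assumes the third-party psutil package is installed (pip install psutil);
# the thresholds mirror the figures in this post, and the 32/64-bit detection
# uses the Python build as a rough proxy.
import platform
import psutil

IS_64BIT = platform.architecture()[0] == "64bit"
MIN_RAM_GB = 2 if IS_64BIT else 1
MIN_DISK_GB = 20 if IS_64BIT else 16
MIN_CPU_MHZ = 1000

def check_hardware() -> bool:
    ram_gb = psutil.virtual_memory().total / 1024**3
    system_drive = "C:\\" if platform.system() == "Windows" else "/"
    free_disk_gb = psutil.disk_usage(system_drive).free / 1024**3
    freq = psutil.cpu_freq()
    cpu_mhz = (freq.max or freq.current) if freq else 0

    checks = {
        f"RAM >= {MIN_RAM_GB} GB": ram_gb >= MIN_RAM_GB,
        f"Free disk >= {MIN_DISK_GB} GB": free_disk_gb >= MIN_DISK_GB,
        f"CPU >= {MIN_CPU_MHZ} MHz": cpu_mhz >= MIN_CPU_MHZ,
    }
    for name, passed in checks.items():
        print(f"{'PASS' if passed else 'FAIL'}: {name}")
    return all(checks.values())

if __name__ == "__main__":
    check_hardware()
```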

There is so much more to an organisation than its operating system, but it’s such a critical part. Here at Cetus, your organisation’s IT is our priority, and with the Windows 7 end of life coming ever closer we’re the best choice for your Windows 10 deployment and support. Make sure to have a chat with our experts sooner rather than later, and make the switch to Windows 10 the easiest you’ve ever experienced.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



Prepare Your Business for Windows 7 End of Life



Why would you ever change a good thing? Because of end of life, that’s why. Grab your tissues: we’re delving into the topic that’s provoking international furore and mourning, the painful issue of Windows 7 end of life. When it comes to desktop and laptop operating systems for both personal and professional use, Windows 7 has been number one since its release in 2009. Originally intended as an incremental upgrade to Windows, especially after the disaster that was Windows Vista three years prior, it maintained hardware and software compatibility, was a major improvement on its predecessor and quickly won the hearts of computer users worldwide. Sadly, hardware and software are not designed to last forever, and it was only a matter of time before it would reach end of life. Though Microsoft has tried to ease the public into the idea of something bigger and better with the release of Windows 8 (which didn’t go down as well as expected) and Windows 10 (which just isn’t Windows 7), the truth is, frankly, hard to swallow.

“But why?” you sob, “We don’t have to let it go, no one would know!” Unfortunately, it doesn’t quite work like that. Well, it could, but it would open your organisation to all of the nasties that would just love the chance to infiltrate your system. The main problem here is support. You might be surprised to hear that mainstream support for Windows 7 ended all the way back in 2015, which means that Microsoft hasn’t been releasing feature updates and service packs since then. That might give you some hope, but alas, I’m here to dash it completely. The only reason you’ve been able to use Windows 7 safely up until now is the security patches released through the extended support period, which ends on 14th January 2020.

Technically, you could continue to use Windows 7 after the extended support period, but the risk of a cyberattack would be significantly higher, and you’d have to ask yourself whether it would be worth it. Microsoft won’t take any responsibility for security breaches that happen to a Windows 7 operating system after the end of life date, and any breach could land you in deep water (and a deeper fine) under the GDPR. The issue with end of life is the loss of the security updates that fix the vulnerabilities hackers exploit; new viruses and malware will be far more advanced than anything the old patches can defend against. Why put so much money and time into protecting your business data only to leave your organisation wide open to whatever cyber attack happens to be the goût du jour? You might think that 14th January is miles away. And it is, but in IT terms it’s not long at all.

So, if ever there was a time for a spring clean, now would be it! The more time you have to plan, the less disruption your IT team and end users are likely to experience, and disruption is exactly what happens when software turns out to be incompatible with a new operating system. Compatibility is likely to be an issue here, as is old hardware (oh look, a new reason for your printer to refuse to cooperate) and your legacy systems. Since you’re effectively changing the very foundation on which your laptops and desktops rely, it’s important to make sure that your systems remain interoperable with one another. Some applications might have newer, more compatible versions available which will work flawlessly on your updated operating system. Another option is to move your legacy applications to a virtualised computing environment, easing the change. Before jumping straight into an OS change, it’s important to have a strategy in place to prevent everything from going pear-shaped and causing any more tears. The good news is, if you’re relying on a cloud-hosted system, upgrading your operating system shouldn’t be much of an issue at all.

With the release of Windows 10, Microsoft has introduced the ‘evergreen’ Windows as a service model. This is a complete revolution in terms of the OS, and it has launched the idea of a ‘final’ desktop release. Instead of having to migrate to a new version every three years, Windows 10 continuously updates every six months or so, providing continued support and patched security. With shorter servicing periods (only 18 months per release now), Windows 10 can’t be treated like a traditional version. To make things that much easier, an in-place upgrade is now possible, meaning that you don’t need to migrate your data and reinstall all of your programs. Windows 10 also provides plenty of new features compared to Windows 7, such as Windows Autopilot, which automatically provisions and enrols your device when you sign in, provided you’re connected to the internet.

January 14th next year will be an incredibly sad day for the millions of Windows 7 users who have so far refused to move on. We’re expecting a worldwide shortage of tissues, ice cream and Windows 10, and a resounding wail of “Not Windows 7! What is this Windows 10 malarkey?” reverberating in the crisp January air. Here at Cetus we know that change is scary and unwanted. We feel the pain of saying goodbye to Windows 7, but when it comes to your organisation, we will help ease all parties into the next operating system. Make sure to have a chat with our experts, who will provide the tissues during this painful time and every other OS change in the future. If you want the most in-depth info and latest tips, make sure to register for our exclusive hands-on workshop with VMware in May.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



Navigate Hybrid Cloud Complexity With VMware NSX



Chances are, if you’ve been thinking about cloud, or you’ve started on your cloud journey, you’ve heard about hybrid cloud. It’s becoming the future not only of the IT departments that run organisations, but of the very culture and fabric of the organisation itself. Employees can no longer afford to sit in an office all day; the very definition of many roles requires the flexibility and fluidity of working remotely. With that, organisations are having to modify and modernise along with this new wave of thinking. IT has had to move away from centralised data centres to a model where applications and data are mobile and easily distributed. In the race to stay relevant and dynamic, many organisations have taken the leap from managing a single data centre to adopting cloud, which usually involves more than one. But the trick with all of this digital transformation is meeting the needs of users without jeopardising security. Having so many clouds can be tricky. We’ve already discussed how cloud sprawl stores data all over the place, which makes it a challenge to identify it all and ensure that it remains secure.

Cloud promises so much. Remote working, secure storage, content collaboration, scalability, flexibility…. Plenty of yummy buzzwords that organisations want to achieve. Though, they have quickly realised that it’s not as easy as it looks, especially when it comes to incorporating and managing multiple clouds. And it’s not that they’re foolish or unprepared; deploying cloud can be a significant nightmare, full of terrors and bumps in the road, and sometimes it’s just not possible to foresee the need for something until you’re too far gone. In the haste to join the Cool Cloud group (who wouldn’t, there are badges), many organisations just didn’t have the time to address the divergent expectations and demands on IT and the business, leading to misalignment and a lot of frustration.

VMware solves the issue of multi-cloud with its NSX solution. VMware has been a leader in the IT industry for decades, and is known for providing the modern network virtualisation platform that aligns business and IT around the same objectives, which is pretty important when IT is what drives your business. NSX comprises several products that work together to create a well-rounded solution. NSX Data Centre reproduces your entire network model in software, making it easy for you to create and provision any network topology in seconds, whether simple or complex multi-tier networks. It creates a common operating environment for all of your applications, on-premises or off. Automation enables a streamlined workflow, meaning you’ll be able to get more work done faster, with security that’s built in and tied to your apps and data. NSX Cloud provides consistent networking and security for applications that run natively in public clouds, while delivering enterprise-class capabilities such as micro-segmentation for easy control over your east-west traffic. The result is precise control over your cloud networking and increased network visibility and analytics. NSX Hybrid Connect delivers secure, seamless application mobility and infrastructure hybridity on-premises and in the cloud, providing high-performance, highly secure and optimised multi-site interconnects.
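If you’re curious what driving that software-defined network looks like in practice, here is a purely illustrative Python sketch that lists provisioned logical switches through the NSX Manager REST API. The manager address, credentials and exact endpoint are assumptions for illustration; confirm them against VMware’s API documentation for your NSX version.

```python
# Illustrative only: list the logical switches (virtual network segments) that
# NSX Data Center has provisioned, via the NSX Manager REST API.
# The manager address, credentials and the /api/v1/logical-switches endpoint
# are assumptions here -- check the API guide for your NSX version.
import requests

NSX_MANAGER = "https://nsx-manager.example.local"  # hypothetical address
AUTH = ("admin", "replace-with-your-password")     # hypothetical credentials

def list_logical_switches():
    resp = requests.get(
        f"{NSX_MANAGER}/api/v1/logical-switches",
        auth=AUTH,
        verify=False,  # only while testing against a self-signed certificate
        timeout=10,
    )
    resp.raise_for_status()
    for switch in resp.json().get("results", []):
        print(switch.get("display_name"), switch.get("transport_zone_id"))

if __name__ == "__main__":
    list_logical_switches()
```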

That all sounds fab, but what does it actually mean for you? What VMware NSX delivers is a unified platform, which is exactly what you want when you’re dealing with complex cloud environments. Since it’s a common operating environment, it provides a single point of control to break down silos and give you more freedom. It delivers high operational efficiency, eliminating manual network configuration and reconfiguration with its super smooth automation, which allows you to quickly grow or shrink applications across environments. With high workload mobility, NSX enables a fast, low-touch migration process that allows you to realise secure, seamless app mobility across all of your sites and clouds. And finally, with NSX technology, you can easily improve standardisation, by provisioning and managing networks and security services within a single management interface, for consistency and scale across all of your environments.

VMware is a leader in virtualisation, that buzzword you’ve been hearing about a lot but figured you’d look into at another time, and it has led the way in enabling organisations to navigate multi-cloud complexity in this digital era. With the cloud freedom that it delivers, your IT and your organisation can become partners in innovating across clouds, without adding cost and complexity. Here at Cetus, we love VMware, and we think that NSX is the bee’s knees (last year we earned VMware’s specialist competency in Network Virtualisation for our experience and expertise in delivering NSX). If you want a bit of cloud freedom for yourself, make sure to have a chat with our experts, who love a good multi-cloud challenge.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



World Backup Day 2019: Losing Data in the Cloud



Happy World Backup Day 2019 everyone! Or, as we call it here at Cetus, World ‘Your Systems Shouldn’t Have Gone Down In The First Place’ Day. We say that because backups are probably one of the most important things you can do for your organisation. You know how you feel when you’re strapped for cash and you find the £20 you’d forgotten you popped into the little zip-up pocket in your bag ages ago? It’s like a little present to future you from past you. Those kinds of presents are important to show yourself how much you care about you. It’s the ultimate love, really. And trust me, you’ll get that same fuzzy feeling, and a whole tonne of relief, when disaster strikes and you’re the one who planned for it. It’s nice to be the hero of the office every once in a while. So, now that you’ve received the obligatory Public Service Announcement (back up today so that you won’t be the April Fool tomorrow), let’s get into the gritty stuff: losing your data in the cloud.

“What?!” you mutter at me. I’m not crazy, stick with me on this. Trust me, I’m as horrified as you are. Surely, surely, the whole point of the cloud is to pop important stuff in there and sit back and relax as your office/laptop/briefcase burns, knowing that all you have to do is get your hands on another device, log in, and forget about the flames behind you. Ah, you’d be mistaken. Losing your data, even when it’s stored in the cloud, is entirely possible, and it happens more often than you think. And it’s not just Google Drive or Dropbox; it’s your favourite, and critical, SaaS apps.

It happens to all of us: you’re looking through a cluttered folder of documents trying to find something specific and it just gets a bit too much. There are some really useless files in there, clogging up your workspace. Half of them haven’t been edited or opened in the last five years. Why keep them? There, all deleted. You’re feeling more zen already. If deleting useless files feels this good, maybe it’s time to turn to the minimalist life. Who needs forty different suits? Two will do. Same with shoes. Donate all those old books, DVDs, CDs, the box of unwanted presents from your evil mother-in-law. You feel the calm setting in; life is that much brighter. Until your manager/supervisor/boss asks you for a particularly important file that, you realise in sudden horror, you deleted in a moment of deranged thinking. Oops. Maybe you need that clutter after all. Other times, you might be collaborating on a project in ShareFile and someone doesn’t pay attention and clicks ‘trash’. Or that project that you started 12 months ago that was scrapped? Well, it turns out it’s received another green light. Too bad you deleted all the work in a moment of frustration and defiance.

But sometimes it’s not you, it’s the SaaS app itself. No, it doesn’t have an agenda against you. They hold large amounts of data that are bombarded with edits and additions. Overwriting is a pretty common issue for SaaS applications, especially when large data sets are imported into the app via a bulk upload. There can also be an issue or two when third-party applications are used to manage the data inside the base SaaS app. A bit like having two opposing football teams. They’re there to do a job, but they’ll slide tackle each other every so often to show dominance.

And then there’s always that absolute twit in the corner who’s always up to something. You know the one; comes in late, the first one out, takes an extra twenty minutes on their break. Does the absolute minimum just to keep the Powers That Be happy. Well, this very person could also delete their own files. Why? Maybe they quit. Or maybe they think that they’re going to get kicked out. Or maybe the feud between them and your manager (the basis of all the office gossip) has reached the point where they just delete important documents out of spite. Whatever it is, those files are unrecoverable, they’re going to hop on over to the next job, and you’re the one who will have to run around picking up the pieces and trying to fit them all together again, before the customer/your scary boss finds out. Not all ‘accidental deletion’ is accidental.

Organisations like yours are reliant on IT infrastructure, and you depend on seamless access to it at any time; if it all went down you’d probably end up with an office full of expensive, sleek-looking paperweights. The amount of business data your organisation will realistically produce over the next ten years will grow exponentially, and with every file created your infrastructure becomes more complex and more important. The very same is true in the cloud. As I’ve written before, at Cetus we take backup and disaster recovery very seriously. So much so that we have a whole branch of our organisation, Continuum, devoted to DRaaS and backups.

Traditional disaster recovery methods are basically a series of fallible, interconnected steps: tapes that don’t exactly stand up over time; backup windows that grow until complete backups can no longer be produced; compute and storage resources that may not be sufficient to provide a complete recovery target; and skilled staff who aren’t available to enact a recovery or to conduct regular tests of the disaster recovery plan. It’s a domino effect of time delays and uncertainty in the case of a disaster. With Continuum, you can tick off several key functions that your organisation needs. We provide a fully managed failover, whatever level of backup, archiving and disaster recovery you require. Testing is important, no matter where your data is stored, and Continuum runs tests at whatever frequency you require, using a combination of automation and specialist knowledge of your environment. It’s also scalable (what isn’t in the ideal cloud world?), so Continuum will grow with your organisation over time. By providing business continuity and disaster recovery, you know that, even in the cloud, your data is safe.
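Continuum’s testing is a managed service with its own tooling, but as a flavour of what an automated integrity check involves, here is a minimal Python sketch that hashes every file in a live directory and compares it against the backup copy. The paths are hypothetical and a real service checks far more than checksums.

```python
# Minimal illustration of an automated backup integrity check: hash every file
# in the source tree and confirm the backup copy matches. A sketch of the
# general idea, not Continuum's actual tooling; the paths are hypothetical.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source: Path, backup: Path):
    problems = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        bak_file = backup / src_file.relative_to(source)
        if not bak_file.exists():
            problems.append(f"missing from backup: {src_file}")
        elif sha256(src_file) != sha256(bak_file):
            problems.append(f"checksum mismatch: {src_file}")
    return problems

if __name__ == "__main__":
    issues = verify_backup(Path("/data/live"), Path("/mnt/backup/data"))
    print("backup verified" if not issues else "\n".join(issues))
```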

Now that I’ve put the panic in you (and hopefully given you the magic antidote), have a chat with our Continuum backup specialists, tell them that I sent you, and know that you’ll never be an April Fool again!

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



Take Control of Your Data Centre With Hyper-Converged Infrastructure



It’s really hard to be part of an IT department these days. IT has become the very lifeblood of an organisation, and the loss of any component that work relies on can cause a revolt, scenes from the Apocalypse, or even mass mutiny. Whether it’s a device, the network, internet access or email, the failure of just a single one can make your organisation grind to a halt. Let’s face it, no one wants to press something in the effort of fixing an existing issue, only to get inundated with phone calls from grumpy users who can’t access anything. I’m sure there are IT managers up and down the country nodding along, absentmindedly caressing their internal scars from the first ring of the phone and getting hit by flashbacks of an incident that, frankly, only an IT member can understand.

Finally, some IT genius somewhere in the world decided to minimise the complexities and frustrations for the rest of the tech community by creating a system so simple, so easy to manage and so flexible that it would make sliced bread look like a stupid innovation. This amazing creation is called hyper-converged infrastructure, and IT departments the world over are bowing down to whoever designed it. HCI promises operational efficiency, reduced costs and manageable scaling, combining all of a data centre’s necessary components (think of your storage, networking, backup and everything else) into a pre-packaged unit that can be controlled and managed from a single console. Frankly, I’d say the only reason we don’t have a national holiday devoted to this mind-blowing achievement is that cloud computing blew in, screaming for attention, and took over the hype. So, I thought it would only be fair to give HCI some of the spotlight and cheerleaders, by giving it the attention it deserves and listing its many benefits so you become Team HCI too.

Buy more to reduce cost
Obviously, anything that reduces cost automatically becomes more interesting to those who make the big decisions and have the finance department wrapped around their little fingers. When it comes to IT, organisations that want to invest in new hardware or software generally pump money into the sleekest, fastest and newest toy on the market. With HCI, you’ll be delighted to discover that the cost of entry is actually lower than you’d expect. That’s because hyper-convergence uses an economic model similar to that of the public cloud providers, using low-cost commodity hardware and scaling the data centre little by little. With this ‘Lego’ building-block approach, your IT can expand as and when you need it to.

Simplify your operations
A day in the life of your IT team involves a lot of admin and everyday tasks. Add in the complexity created by so much tech, and your IT department can quickly become overwhelmed by the little bits, rather than focusing their energy on the projects that really need their attention. Since hyper-converged infrastructure systems consolidate core storage, backup, deduplication and networking, it goes without saying that hardware clutter and touchpoints are drastically reduced. All workloads fall under the same umbrella, which makes it easier to migrate VMs between different appliances or data centres. With all of those previously pesky everyday admin duties taken care of, the IT department can either get smaller or redirect its focus to something more important. Legacy data centre hardware gets folded into HCI, which means there’s no need to have specialists in each resource area running around the office; IT staff only need a broad knowledge to apply infrastructure resources to meet individual application needs.

Make automation work for you
Legacy infrastructure has proven to be quite a hindrance in the face of the latest waves of tech modernisation. Because legacy environments can be so varied and complex, automation has been all but impossible to adopt. As hyper-converged infrastructure works on the principle of a software-defined data centre, routine operations can be automated once centralised management tools are implemented. Everything is included in a unified environment, taking away the headache of integrating hardware from various manufacturers or product lines. Adopting automation increases the efficiency of the IT team, keeping the business as a whole agile and competitive.

Data protection
Organisations are looking for the holy trinity when it comes to their IT infrastructure: the ability to work remotely, the most agile and up-to-date infrastructure, and keeping everything safe. With an organisation’s network no longer confined to the four brick walls of the building, protecting data, applications and workspaces from the evil lurking beyond the perimeter has become even more important, especially in the wake of Gen V cyber attacks such as WannaCry. Hyper-converged infrastructure is a leader in cloud efficiency and storage, incorporating snapshotting, data deduplication and other data protection features, which is exactly what you want in your corner when disaster strikes and you need to recover everything. It offers higher resiliency than traditional legacy systems, with the scale-out model spreading data across multiple nodes within a single data centre or between several. If one appliance or rack goes down, you won’t have to worry about performance or availability suffering.

Feel the freedom of flexibility and scalability
Another step away from the archaic confines of a legacy system, HCI allows for greater flexibility in your data centre. Since it’s based in software, it takes a ‘Lego block’ approach to scalability, with each HCI appliance a self-contained unit that includes all of the hardware resource your data centre needs. This increases flexibility and allows quick expansion of data centre capacity by simply adding an extra unit or two when needed. There’s no need for complex upgrade plans each time you need to expand, nor do you need to plan years into the future by estimating what compute resource you might need and investing in costly data centres all in one go, just for it all to be old and out of date by the time you need it.

Simplified procurement and support
Since HCI is basically a plug-and-play unit that incorporates pretty much everything you would need in your data centre, it’s quite similar to the offerings of systems integrators. By getting one point of contact for the life of the system, your IT team can cut out a lot of unnecessary faffing about between different vendors and hardware models. This makes it more cost-effective than integrated systems, especially when it comes to upgrades. When the manufacturer makes any upgrades, the units gain the benefits automatically, without having to replace hardware.

At Cetus, we have long-standing relationships with the leading HCI vendors. Have a chat with one of our experts, and experience the benefits of HCI first-hand.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



Citrix Workspace: Delivering The End User Promise



Citrix is a lot of things to a lot of people. Here at Cetus, we’re a Citrix Platinum Partner, which we shout about at any opportunity. To others, Citrix is the ultimate Summit event, the paracetamol for an IT headache, or a name heard in passing when referring to ‘the IT system’. Regardless of what Citrix is to you (why not share down below?), we can all agree on one universal thing: Citrix keeps its promises. There was some scepticism when Citrix made some huge announcements and changes during Citrix Synergy last May, promising to deliver them all within the following 90 days. And sure enough, Citrix kept most of them, completely transforming the suite of products on offer.

User experience has become something of a pioneering point for workspace platforms. IT teams have realised that the platforms they roll out have to be easy to manage and maintain, and user-friendly. With the latest updates to Citrix Workspace, this has very much been a focal point for the developers. But that’s not all: Citrix Workspace unifies solutions in a single management plane, reduces complexity and simplifies solution delivery and management. It can be the way to move 100% to the cloud, or help organisations extend their existing on-premises deployments. And that’s before thinking of the user. So, how exactly has Citrix delivered the end user promise?

All in one
Back in the day, the goal for the IT department was simply to manage and secure users’ apps, desktops and data as easily as possible. That’s all grand and good, until users go AWOL and refuse to adopt the platform, leading to shadow IT and a nightmare for the IT department. Citrix’s approach is to focus on the needs and wants of the end user: by putting the user at the centre of everything, there’s less chance of mutiny. By thinking like employees, systems become easy to use and the rate of adoption improves significantly. Who knew it could be so easy to deliver a simple, secure digital workspace?

We all work here
The whole point of Citrix Workspace is to be a productivity tool that provides a unified experience, enhancing the work lives of users, not causing more trouble. There are so many types of user, and we’re not even talking about the distinct difference between remote and office workers. If you stand at the door of your office and look out over all the people working away, there’s no chance that they all do the same thing and need an identical workspace. To be successful, IT needs to be able to support all of them. Your workspace needs to deliver all of the apps and data that users need (Windows, mobile, web and SaaS) on whatever device they happen to be using. What makes Citrix Workspace different is the delivery of a consistent user experience and contextual access to apps and data. It allows access that seamlessly changes based on context, maintaining security and a great user experience.

Integrating the workspace
One complaint that comes up over and over again is the actual screen that users interact with during a workspace session. Other solutions often deliver a wall of icons. Citrix Workspace delivers an integrated experience, providing access to all of a worker’s apps from a single interface, reached through a single sign-on. There’s no need to be constantly trying to remember passwords for different apps that all demand updates at different times.

Access everything from one screen
How many places do you store files? On your laptop, the network drive, Dropbox, the dusty hard drive that resembles a brick (that needs its own power source?), that trusty USB with a generic logo that follows you from desktop to desktop…. We try and store things where they belong, but it’s not always possible. In other workspaces, you need to launch the app for the file that you’re trying to open first, wandering around your desktop till you find it. But with Citrix Workspace, files are available in the context of the application, so it’s just a case of clicking on the file and letting it do it all for you, locally when on a secure network or with a hosted app on an untrusted network.

We love Citrix and we love Citrix Workspace. Because we put end users as a focal point in our solutions, you can be guaranteed that your employees will work more productively. Have a chat with our Citrix Platinum Partner experts to refresh your workspace, or check out our events page, where we regularly host Citrix events.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



Workspace ONE: Everything You Need In A Workspace



There has been quite a lot of excitement surrounding VMware’s Workspace ONE since its announcement in February 2016. The modern version of VMware’s earlier Workspace Suite, Workspace ONE integrates identity management and enterprise mobility management (EMM), modernising end user computing. With regular updates, the platform continues to excite users across the globe, truly delivering on the flexible ‘any app, any device’ vision that drives the mobile workspace revolution.

Workspace ONE is a secure enterprise platform that delivers and manages any app on any mobile device, integrating identity management, real-time application delivery and enterprise mobility management. It accelerates how workspace services are delivered, while engaging digital employees, reducing the threat of data leakage and modernising traditional IT operations for the mobile cloud era. And with yet more announcements as recently as November at VMworld in Barcelona, here are the reasons why Workspace ONE really is everything you need in a modern workspace.

Embrace your inner SaaS
Workspace ONE allows organisations to embrace SaaS and mobile apps while supporting existing enterprise applications (think of your legacy systems), and all securely.

Productivity is the name of the game
The best way for your users to be productive is to provide them with tools that give them the freedom to work while maintaining the right data security and compliance, which is exactly what Workspace ONE delivers.

Adopting Windows 10
Windows 10 is taking over the enterprise space and, with Workspace ONE, you can accelerate adoption of the platform by using the same modern management framework designed for mobile devices.

Authentication that adapts
With adaptive conditional access, Workspace ONE ensures the right level of security for each individual user based on authentication strength, data sensitivity, user location and device posture.

Cyber security with a difference
A huge issue for organisations is the users’ disregard for individual responsibility towards cyber security. Workspace ONE delivers improved security and employee engagement, incorporating next-generation automation and insight.

Improved user experience
The new Workspace ONE hub is truly user-centric, streamlining onboarding and user experience across all platforms.

And all delivered as a service
To speed up app transformation, Workspace ONE delivers virtual applications and desktops as a service.

By adopting VMware’s Workspace ONE platform, you can join the future of work today and combine an excellent user experience with ease of management for any IT team. Have a chat with our Workspace ONE-centric VMware experts to get started.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



How to Avoid the Common Pitfalls of VDI



With the explosion of cloud and mobility completely changing the way people work, organisations have had to adopt more fluid digital workspaces to support a dynamic workforce in the era of consumerisation. Digital transformation has quickly become the answer to the wave of innovation and new ideas behind the rapid development of technology. To keep up, organisations need to move from traditional desktop models to digital workspaces tuned for mobility, a workforce using new device form factors and the agile delivery of new applications. Desktop virtualisation is the name of the game here, and IT teams leverage virtual desktop infrastructure (VDI) to deliver it.

VDI is one of those handy little tricks that every genius in the IT department has in their box of tools. For your IT admins, VDI can reduce desktop administrative and management tasks and enable apps to be easily added, patched and upgraded. It also allows your admins to manage security and data protection from a central point of control, which in the long term gives your organisation a lower total cost of ownership and improved data protection. There are plenty of reasons to run desktop operating systems and applications on virtual machines, hosted either on-premises in your data centre or off-premises in the cloud and accessed via desktop clients or mobile devices, but there can be a catch. Here are the most common pitfalls that organisations come across when adopting VDI, and how you can avoid them.

As with most projects that affect employees, incorporating VDI in your organisation is going to cause change, and for the most part change is one thing that people really dislike. Clear communication is vital for project success; involving all parties early on will engage users and let them know what’s coming. Set up a schedule of periodic meetings that detail the upcoming changes and the benefits they will bring. It’s also important to take a step back and consider exactly what you’re trying to achieve with virtual desktops before you even start thinking about IT requirements. What are your users’ needs? Involving your users early in the process helps to manage expectations and understanding, which will result in their acceptance of the end solution. Whatever project you’re adopting, it will ultimately fail if your users perceive that it won’t meet their needs or expectations.

Assembling the right team for your VDI project can be easier said than done. A very common mistake in app and desktop virtualisation projects is to employ a team of virtualisation architects instead of desktop and application administrators. While architects might seem like the logical first step, the reality is that virtualising desktops and apps is very different from virtualising infrastructure. Admins who are skilled in virtualisation typically don’t build their own workloads; they focus on operating servers in a virtualised environment. Build your team around the people who design and manage your desktop and application environments. With desktops now hosted in the data centre, it’s also important that the storage systems hosting the desktops and the networks used to access them are designed correctly, so involve the storage, server and network specialists in a coordinated and collaborative fashion.

The whole point of deploying VDI is to benefit the user, so it’s important to define your app and desktop virtualisation use cases properly. Use cases are built on types of workers and their job requirements, the applications and devices they use, their requirements for storage and multimedia performance, and their network connectivity constraints. Consider the culture and needs of your organisation and your attitude towards the use of infrastructure when defining your workflow requirements. Thinking strategically, such as asking yourself whether there are users who require high-definition video, will ensure that users receive the resources and system performance appropriate for the work that they do and the way they perform their tasks. Avoid oversimplifying your users’ needs, as different workers in the same office setting have varying performance requirements.

Conducting a pre-assessment of your new desktops and applications is crucial to understand the workloads that will run in the virtualised client environment and their associated technical requirements. Without one, you’ll design the solution on nothing but assumptions, which poses a real risk when it goes live. By looking at the applications your users are running, how long they take to launch and how they perform on a physical desktop, the pre-assessment helps determine how many users are actually using each application and how that affects the way applications should be delivered. It should also cover the utilisation of CPU, memory, disk and network bandwidth in the physical systems, which is crucial for properly sizing the underlying infrastructure.
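As a flavour of the kind of data a pre-assessment gathers, here is a minimal Python sketch that samples utilisation on a physical desktop, assuming the third-party psutil package is available. A real assessment tool would run for weeks and inventory installed applications too.

```python
# Sketch of a pre-assessment sample from one physical desktop: capture CPU,
# memory, cumulative disk I/O and network throughput at intervals so sizing
# isn't based on assumptions. Assumes the third-party psutil package; a real
# assessment would run for weeks and also inventory installed applications.
import csv
import time
from datetime import datetime

import psutil

def sample_utilisation(outfile="pre_assessment.csv", samples=12, interval=5):
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "cpu_percent", "mem_percent",
                         "disk_read_mb", "disk_write_mb",
                         "net_sent_mb", "net_recv_mb"])
        for _ in range(samples):
            disk = psutil.disk_io_counters()
            net = psutil.net_io_counters()
            writer.writerow([
                datetime.now().isoformat(timespec="seconds"),
                psutil.cpu_percent(interval=1),
                psutil.virtual_memory().percent,
                round(disk.read_bytes / 1024**2, 1),
                round(disk.write_bytes / 1024**2, 1),
                round(net.bytes_sent / 1024**2, 1),
                round(net.bytes_recv / 1024**2, 1),
            ])
            time.sleep(interval)

if __name__ == "__main__":
    sample_utilisation()
```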

In a traditional desktop environment, each user had full access to their own disk spindle (or dedicated hard drive, whichever the case may be), so poor bandwidth to WAN sites could be accepted and endured. When you move to VDI, it’s important to understand exactly how it will impact network bandwidth, storage area network (SAN) array processor utilisation and display protocol performance. These variables affect application performance, which should already have been fully explored through engagement with users. Your users can help you generate realistic proof-of-concept or pilot workloads to validate their requirements for graphics bandwidth, storage, I/O and more. If your WAN links can’t provide the bandwidth for a VDI environment, or the latency is too high, you might want to consider local deployments.
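As a back-of-the-envelope illustration of that WAN sizing exercise, here is a short Python sketch. The per-user bandwidth figures are assumptions for illustration only and should be replaced with measurements from your own pilot.

```python
# Back-of-the-envelope WAN sizing for a remote site: multiply concurrent users
# by an estimated per-user display-protocol bandwidth and compare with the
# link. The per-user figures below are illustrative assumptions -- validate
# them with real pilot measurements for your display protocol and workloads.
KBPS_PER_USER = {
    "task_worker": 150,        # light office apps
    "knowledge_worker": 400,   # multitasking, some media
    "hd_video_user": 2000,     # high-definition video or graphics
}

def required_mbps(user_counts: dict, headroom: float = 1.3) -> float:
    """Estimated peak requirement in Mbps, with headroom for bursts."""
    total_kbps = sum(KBPS_PER_USER[kind] * count
                     for kind, count in user_counts.items())
    return round(total_kbps * headroom / 1000, 1)

if __name__ == "__main__":
    site = {"task_worker": 40, "knowledge_worker": 25, "hd_video_user": 5}
    need = required_mbps(site)
    link_mbps = 100  # hypothetical WAN link
    verdict = "OK" if need < link_mbps else "consider local deployment"
    print(f"Estimated need: {need} Mbps on a {link_mbps} Mbps link ({verdict})")
```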

Regardless of the size of your organisation, deploying VDI will be a complex procedure. You might have several thousand employees using hundreds of different applications, including speciality products for particular job functions. You might have existing deployment strategies for commonly used applications, but speciality apps can easily be overlooked, as it’s easier to just install them for the small number of users who require them. Obviously, the need for a deployment strategy covering all applications will directly impact the way the virtual desktop environment is designed; if applications are installed at user login, for example, the VDI environment can be designed to meet that constraint. Do tasks in parallel to help your project team meet deployment deadlines, and have a clear understanding of how you will deploy, update and manage every application in use before deciding how applications will be packaged and how wide-ranging updates will impact performance.

Finally, it’s crucial to avoid skipping or mismanaging the pilot project. Failure is a serious risk if your organisation skips the pilot phase, or if you run a pilot that doesn’t produce a clear outcome. It should have clearly defined objectives and a specified timeframe, engaging real users from various use cases to pilot the environment and generate meaningful load data. It’s important not to test the VDI environment on IT administrators first, as they aren’t representative of your entire organisation’s user base. It should engage the desktop support teams to provide end-user support to prevent the project team from attempting to provide 24/7 support to users.

With these tips, you can be sure to design a VDI environment that will provide users with most of the functionality and performance of desktop operating systems and applications along with higher availability and a lower risk of hardware failure. At Cetus, VDI is something that we’ve deployed many times and we like to think of ourselves as experts. If virtual desktop infrastructure is something that your organisation is looking to deploy, make sure to speak to our experts.

Speak to an expert


Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.


How to Prevent Cloud Security Threats



The end of winter is always a little grim, especially around the UK. The trees are bare, it gets dark way too early, the two-minute dash to the car threatens a soaking and the prospect of sitting outside with a cold beer or cocktail after a long day of work is surprisingly not tempting at all. I hate to be the harbinger of bad news, but it could get worse. Imagine coming in one morning, wind-swept and dripping, to the news that overnight your cloud fell victim to a cyber-crime. At that point a cold beer in the snow seems like the best remedy for the situation. Despite everything, the internet isn’t as safe as you’d think, with devious ne’er-do-wells lurking around every corner, eager to take advantage of any vulnerability to make a quick buck. This matters especially when your organisation is connected to the cloud, where your data is internet-facing instead of nicely locked up in an on-prem data centre. It doesn’t mean you should go out and raid Currys PC World for enough storage to build a makeshift data closet, though. Here are a few tips to keep your cloud secure, while enabling the innovation, data access and flexibility that you wanted in the first place.

We’re all in this together
The beauty of the network is that it’s all inter-linked. That’s how you can collaborate so easily, and throw your data and apps onto the cloud to access them whenever, wherever. Just remember, every ‘point of entry’ can be the weakest link, so it’s important to educate everyone in your organisation on how to protect themselves to protect the wider network community. It’s important to involve your entire organisation, making them aware that cyber security is just as much their responsibility as it is yours. Unfortunately, the biggest percentage of criminal infiltration comes down to users accidentally letting the cyber criminal in, usually through phishing or malware attempts. Phishing is a bigger threat to your organisation than ransomware is, and it all comes through malicious emails that get acted on. In this super quick blog post, I’ve already covered how to reduce the risk of getting on the phisherman’s hook. Also, it’s important to set up a (non-judgemental) plan for any user who feels that they might have been compromised, without them having to resort to throwing their laptop out of the window. Then you can be sure that they won’t be throwing out hardware willy-nilly, or sweeping incidents under the rug that could cause you harm in the long run.

Secure your data backup plan (just in case)
Data loss is always a serious worry. Thankfully, in the case of cloud, the worry is slightly smaller: by storing your data in the cloud, it’s super simple to link your network to a backup storage solution so that, whatever happens, you’ll be covered. We think backups are so important that we’ve created a whole extra branch of Cetus, the Continuum Service. I’ve already touched on the 21st century’s answer to tape backup (it’s replication, by the way), and that’s what we offer with Continuum. It’s a full infrastructure recovery, not just your data, making sure that your underlying server and desktop infrastructure is a-ok to get your business back on its feet in record time. It’s testing on a regular basis, automatically verifying the integrity of each virtual backup server on its way to the cloud, so you know that if something happens there’ll be minimal disruption. It’s a fully managed service that gives you peace of mind, knowing that we have a whole team on the situation 24 hours a day. It’s a holistic backup and recovery solution that provides local file and VM restoration.

Who has access?
You can build the strongest walls around your building, adopt the most up-to-date firewall and screen every little thing that enters your network, but sometimes it’s important to be wary of the Trojan horse plodding through your hallways every day. And by Trojan horse, I mean an employee who could be stealing, irresponsibly sharing, or compromising your data. As an IT department, it’s important to assess who has access to what. There is absolutely no reason for Sharon in HR to be able to view or edit financial records, and what on earth could she be doing popping in twice a week at 3am? Establish access controls so that you can manage risk, tying user identities, even external ones, to back-end directories.
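As a toy illustration of that kind of access review, the sketch below compares what each user can actually reach against what their role should allow. The roles, systems and users are entirely made up; in practice this data would come from your directory and application audit logs.

```python
# Toy access review: flag entitlements that fall outside what a user's role
# should allow. The roles, systems and users below are made-up examples of
# the kind of data you'd pull from your back-end directory and applications.
ROLE_ENTITLEMENTS = {
    "hr": {"hr_system", "intranet"},
    "finance": {"finance_system", "intranet"},
    "it_admin": {"hr_system", "finance_system", "intranet", "admin_console"},
}

ACTUAL_ACCESS = {
    "sharon": ("hr", {"hr_system", "intranet", "finance_system"}),  # over-privileged
    "raj": ("it_admin", {"admin_console", "intranet"}),
}

def review_access():
    for user, (role, access) in ACTUAL_ACCESS.items():
        excess = access - ROLE_ENTITLEMENTS.get(role, set())
        if excess:
            print(f"{user} ({role}) has access they shouldn't: {sorted(excess)}")

if __name__ == "__main__":
    review_access()
```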

It’s important to put security measures in place that will ensure that your data and apps are protected. Why not embrace the latest technologies and adopt a smartphone access control system that will allow you to manage users and assign door access from anywhere?

Encryption and passwords are key
Your cloud can be quite vulnerable. Sitting up there, without the safety of your watchful eye, anything could be happening. It’s a bit like a spaceman floating in the big black expanse that is space, tethered to the International Space Station. He is protected by nothing more than his spacesuit, which shields him from the general elements, but there’s only so much you can do if he gets hit by a meteor. What he needs is an extra layer of bubble wrap and diamond outer shell. In the case of your cloud, this would look less rigidly cosy and more like good passwords and encryption.

In this example, we’re going to look at your spaceman’s bubble wrap: your password. Files in your cloud are zipped and protected with passwords, so it’s important that you choose a strong one. Having a unique password for your cloud is a must; if a hacker gains entry, they would have access to a huge amount of your data, so it’s crucial that you keep it as safe as possible. Look into multi-factor authentication (which I’ve already discussed in detail in this post), adding an extra level of protection that you can control with fingerprint and retina biometrics.

If passwords are your spaceman’s bubble wrap, then encryption is his diamond shell, and it’s crucial. Cloud encryption transforms your data and apps before they travel to the cloud, so they’re stored securely. Encrypt at your network’s edge, ensuring that the data is protected before it even leaves your network. But make sure to keep the encryption and decryption keys stored away from where you store your data!
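As a minimal sketch of encrypting at the edge, the Python below uses the third-party cryptography package’s Fernet recipe to encrypt a file before it is uploaded. Writing the key to a local file is purely illustrative; in practice keys belong in a key management service, well away from the data they protect.

```python
# Minimal sketch of edge encryption: encrypt the file locally before it ever
# leaves your network, then upload only the ciphertext. Uses the third-party
# cryptography package (pip install cryptography). Writing the key to a local
# file is illustrative only -- keep real keys in a key-management service,
# well away from where the encrypted data is stored.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_for_upload(plaintext_path: str, key_path: str = "upload.key") -> Path:
    key_file = Path(key_path)
    if not key_file.exists():
        key_file.write_bytes(Fernet.generate_key())
    fernet = Fernet(key_file.read_bytes())

    source = Path(plaintext_path)
    encrypted = source.with_suffix(source.suffix + ".enc")
    encrypted.write_bytes(fernet.encrypt(source.read_bytes()))
    return encrypted  # this is the file that gets sent to the cloud

def decrypt_after_download(encrypted_path: str, key_path: str = "upload.key") -> bytes:
    fernet = Fernet(Path(key_path).read_bytes())
    return fernet.decrypt(Path(encrypted_path).read_bytes())

if __name__ == "__main__":
    enc = encrypt_for_upload("quarterly_report.xlsx")  # hypothetical file
    print(f"Upload {enc} to the cloud; keep upload.key somewhere else entirely.")
```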

So, before you cobble together a makeshift data centre in a panic to protect your information, take a look at the level of security your cloud solution allows. The integrity of your cloud is vital for the health of your business; protecting your data and the access to it should be a priority for IT teams. Contact our Cetus experts to see how we can provide the ultimate protection for your cloud environment.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.



The New Hero of Cyber Security; Zero Trust



The ability to work from anywhere, on any device, has quickly become one of the greatest developments in the workspace of all time. That’s not an exaggeration either. In the UK, 4.2 million people were working from home in 2015, and between 2012 and 2016 the mobile workforce increased by 12.35%, a percentage that keeps growing year on year. It’s widely predicted that by 2020, half of the UK workforce will no longer be plonked in an office all day. That means it’s time to invest in nice shoes and/or new pyjama bottoms. And while that’s an amazing turn of events, it causes some significant security concerns for everyone involved. With so much of your workforce wandering the plains of the UK, your network is no longer secured by an actual bricks-and-mortar perimeter.

Today’s increasingly decentralised enterprises have become a bit of a headache for IT, who now have to keep you secure even when you’re off the network, potentially clicking on some dodgy email phishing links. Zero trust has evolved to answer the issue. Back when cybercrime was still all the way at Gen III, most organisations assumed that their security protection was robust enough to keep them safe. The few who did err on the side of caution deployed security operations centres or other cyber monitoring solutions, but for the most part IT departments assumed that anything inside the perimeter was safe. Oh, but those were far simpler times.

By working on the assumption that any resource in the network might be compromised, zero trust puts monitoring in place so that you have the power to take remedial action when it’s needed. Under this model, no service or server is considered more secure than the next. It’s basically a data-centric network design that puts micro-perimeters around specific data or assets, giving you the flexibility to enforce more granular rules. It solves the ‘flat network’ problem of hackers infiltrating your network and scurrying around undetected. With the right guidance (you’re welcome in advance) and a little bit of know-how, it only takes a couple of steps to get started with zero trust.

Identifying your sensitive data is the obvious first step. It sounds like an easy way to start the process, but it’s a little more challenging than you’d think. You can’t possibly protect data that you can’t see or don’t know about. You need to know where your employees store their data, exactly who uses it, how sensitive it is and how they, your partners and your customers use it. Without knowing all of this, you’re putting your data and your organisation at risk, and you can’t start investing in security controls until you know what it is you’re actually trying to protect. When you have a better idea of what you’re dealing with, it’s time to classify it all. I suggest enlisting the help of your most organised member of staff before moving on to mapping your data.

To understand how you’re going to employ zero trust, and therefore micro-segment specific sensitive data, you need to know how that data flows across your network as well as between users and resources. This is a fun (probably not) exercise to run with your stakeholders, such as application and network architects, to fully understand how they approach information. To give yourself a bit of a springboard, security teams should streamline their flow diagrams by leveraging existing models. A zero trust network is based on how transactions flow across a network and how users and applications access data. Optimise the flow to make it simpler, and start identifying where micro-perimeters will be placed and segmented with physical or virtual appliances. In a network where the compute environment is physical, the segmentation gateway will usually be physical as well, whereas a virtualised compute environment will deploy a virtual segmentation gateway.

Once you’ve determined the optimum traffic flow, micro-segmentation is the name of the game: decide how to enforce access control and inspection policies at the segmentation gateway. The point of zero trust is to enforce identity-based rights, so that you can control who has the privileges to access specific data, which means knowing exactly which users need to access what. You need to know more than the source address, port and protocol for zero trust to work; security teams need to understand the user identity as well as the application to establish access rights. Having created your ecosystem, it’s important to ‘Big Brother’ it to identify malicious activity and areas for improvement. There’s no point only logging traffic that comes from the internet; god only knows what kind of infectious diseases your network could contract from a wild-spirited USB. With your shiny new zero trust network, the segmentation gateway can send all of the data flowing through it, including traffic destined for both internal and external network segments, straight to a security analytics tool that inspects it properly.
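As a toy illustration of the kind of decision a segmentation gateway makes under zero trust (evaluating user identity, application, data sensitivity and device posture rather than just addresses and ports), here is a short Python sketch. It isn’t any vendor’s actual policy engine; the roles, applications and policies are made up.

```python
# Toy zero-trust policy check: the decision is based on who the user is, which
# application they're using, how sensitive the data is and the device posture,
# not just on source address and port. Purely illustrative -- not any vendor's
# actual policy engine.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_role: str
    application: str
    data_classification: str   # "public", "internal", "restricted"
    device_compliant: bool     # e.g. patched, encrypted, managed

POLICIES = [
    # (role, application, highest classification allowed, requires compliant device)
    ("finance", "erp", "restricted", True),
    ("hr", "hr_portal", "restricted", True),
    ("any", "intranet", "internal", False),
]

LEVELS = {"public": 0, "internal": 1, "restricted": 2}

def allow(req: AccessRequest) -> bool:
    for role, app, max_class, needs_compliance in POLICIES:
        if role in (req.user_role, "any") and app == req.application:
            if LEVELS[req.data_classification] <= LEVELS[max_class]:
                return req.device_compliant or not needs_compliance
    return False  # default deny: nothing inside the network is trusted by default

if __name__ == "__main__":
    print(allow(AccessRequest("finance", "erp", "restricted", device_compliant=True)))   # True
    print(allow(AccessRequest("finance", "erp", "restricted", device_compliant=False)))  # False
```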

Now that you’re the proud owner of a zero trust network, you can rest easy knowing that your network is being monitored effectively. Here at Cetus, we believe that building the best architecture is just as important as keeping it safe. We’re experts in all things datacentre and cloud, so make sure to have a chat with one of our specialists who can help you through all of your security challenges. And while you’re at it, book yourself in for our complimentary security posture review to identify where your organisation is being exposed to the nasty things that lurk on the outside of your perimeter.

Speak to an expert

Missy Beaudelot – Digital Marketing Executive
With a background in journalism and an interest in all things tech, Missy keeps our social media in check while monitoring our websites and developing our digital presence.