
Veeam eyes big data; pushes importance of applications and services

12 Oct 16

Veeam is turning its focus to big data – and how to protect it – as companies increasingly seek to harness it.

Clint Wyckoff, Veeam Software global technical evangelist for technical product marketing, says big data risks becoming a loss for any business if measures aren’t taken to ensure its infrastructure can handle the volume of data involved.

“It’s definitely an area Veeam is looking into from a data lakes perspective – being able to protect large amounts of object storage,” Wyckoff says.

Wyckoff, who spends much of his time talking with the IT professional community in his evangelist role, says another key area of focus for Veeam recently has been the ITIL – IT Infrastructure Library – framework, and the importance of IT professionals – both in house and at resellers – understanding that ‘the most important thing to any business is applications and services’.

“That’s what enables the business,” Wyckoff says.

“IT needs to enable them and they need to understand that and be responsive to what the business’ needs are, whether that is from deploying apps and services, spinning up VMs, making sure there is enough capacity in the environment to withstand, for instance, a peak holiday season.

“That’s one of the first pillars of what cloud computing is about – the elasticity, the ability to expand out and contract back based on business demand.”

The vendor has seen high uptake of its Veeam Cloud Service Provider offerings across Australia and New Zealand.

Wyckoff advocates regular meetings with the business to ascertain what different departments have coming up in the future that IT needs to know about to ensure it is adequately prepared.

He says Veeam offers a holistic view of not just the physical environment but also the virtualised environment, providing users with deep, application-level monitoring.

“If you think about the purpose of [Microsoft] System Center – providing network operations with a green light if the system is good, break-fix type activities – we allow that relationship from the application level. Is it running SQL Server? What applications are running on here?

“And we can create that relationship down to the virtualised environment related to data stores and things of that nature,” he says.

“But the most important part of that is ensuring applications are available and putting SLAs around them. That’s one of the large focuses of my background because I was successful in doing that as an end user.”

When it comes to cloud, Wyckoff says companies need to look beyond the ‘low hanging fruit’ of backup and disaster recovery and consider how data backed up off site can be utilised in different ways.

“Backup and disaster recovery is easy to get off site because it is non-disruptive to any business processes – you’re just sending business data out to a secondary location.

“And there are a lot of different ways that can be utilised. I can use it for development environments, I can use it for test infrastructure, perhaps as a failover mechanism out to these secondary data centres, where I can test disaster recovery.”

That testing is something he urges IT departments and resellers to focus on.

“You need to be not only making sure you’re backing up and protecting things, but how often are you testing to make sure that the backups and recovery work.”
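As a purely illustrative sketch of what regular restore testing can look like in practice – no Veeam-specific tooling is implied, and the paths and manifest format below are hypothetical – a scheduled job might restore the latest backup into a scratch location and compare checksums against a manifest captured at backup time:

```python
# Hypothetical restore-verification job: restore the newest backup into a
# scratch directory, then compare file checksums against the manifest that
# was written when the backup was taken. Paths and manifest format are
# illustrative assumptions, not any particular product's layout.

import hashlib
import json
from pathlib import Path

SCRATCH = Path("/restore-test/scratch")         # hypothetical restore target
MANIFEST = Path("/restore-test/manifest.json")  # {"relative/path": "sha256", ...}

def sha256(path: Path) -> str:
    """Stream a file and return its SHA-256 digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore() -> bool:
    """Check every file listed in the manifest against the restored copy."""
    expected = json.loads(MANIFEST.read_text())
    failures = []
    for rel_path, digest in expected.items():
        restored = SCRATCH / rel_path
        if not restored.exists() or sha256(restored) != digest:
            failures.append(rel_path)
    if failures:
        print(f"Restore test FAILED for {len(failures)} file(s):", failures[:10])
        return False
    print(f"Restore test passed: {len(expected)} files verified.")
    return True

if __name__ == "__main__":
    verify_restore()
```

Run on a schedule, a job like this turns “how often are you testing” into a concrete, repeatable check rather than an occasional manual exercise.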

However, he says IT still ‘really struggles with that off site piece’.

“It’s really easy to back it up locally, I can do all that, but how do I get it offsite? Some people use tape, some people use a colocation in another data centre where they have, for instance, a cage of equipment, others look to a third party to do that.”

“If I’m sending backups offsite, is there an infrastructure there for me to restore to so I don’t have to pull it back down? Do I have hardware sitting there idle, ready to restore to? Or if I do need to pull data back down, what is the recovery time going to look like?”

“It’s just making sure you understand what the recovery time objective is, so if I’m sending off my most mission-critical information to a site where there’s no hardware and I have to pull it back down, that could take days if it’s a large set of data. Is the business OK with that? If yes, then that meets my business’s requirements. If no, then maybe I need to figure something else out.”
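To put the ‘could take days’ point in concrete terms, here is a minimal back-of-the-envelope estimate of raw transfer time when pulling a backup set back across a WAN link. The dataset size, link speed and efficiency figure are illustrative assumptions, not numbers from the article; the point is simply that recovery time over the wire is easy to estimate and easy to underestimate.

```python
# Rough recovery-time estimate for pulling backup data back from an
# off-site location over a WAN link. All numbers are illustrative
# assumptions, not figures from the article.

def restore_transfer_time_hours(data_tb: float,
                                link_gbps: float,
                                efficiency: float = 0.7) -> float:
    """Estimate raw transfer time in hours.

    data_tb    -- size of the backup set in terabytes (decimal TB)
    link_gbps  -- nominal link speed in gigabits per second
    efficiency -- fraction of nominal bandwidth actually achieved
    """
    bits_to_move = data_tb * 1e12 * 8            # TB -> bits
    effective_bps = link_gbps * 1e9 * efficiency  # usable bits per second
    return bits_to_move / effective_bps / 3600


if __name__ == "__main__":
    # Example: 50 TB pulled back over a 1 Gbps link at ~70% efficiency
    hours = restore_transfer_time_hours(50, 1.0)
    print(f"~{hours:.0f} hours (~{hours / 24:.1f} days)")
    # Roughly 159 hours, i.e. about 6.6 days -- the kind of multi-day
    # recovery window the business has to explicitly sign off on.
```

If that number is longer than the business can tolerate, the options Wyckoff lists – standby hardware at the secondary site, or a provider that can restore in place – become the deciding factor.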
