"IT Transformation can result in bottom-line benefits that drive business differentiation, innovation, and growth," according to research recently conducted by Enterprise Strategy Group (ESG) and announced by Dell EMC. The study aimed to evaluate how today's enterprises are using technology and the effects they are seeing. The report reached these conclusions:
- 81% of survey respondents agree if they do not embrace IT Transformation, their organization will no longer be competitive in their markets, up from 71% in 2017
- 88% of respondents say their organization is under pressure to deliver new products and services at an increasing rate
- Transformed organizations are 22x as likely to be ahead of the competition when bringing new products and services to market
- Transformed organizations are 2.5x more likely to believe they are in a strong position to compete and succeed in their markets over the next few years
- Transformed companies are 18x more likely to make better and faster data-driven decisions than their competition and are 2x as likely to exceed their revenue goals
The report revealed an interesting trend: companies are competing more and more on how they use technology. This is easy to see on a workday lunch trip - on a busy day, would you be more likely to order from a restaurant that offers easy mobile ordering and payment, or from one that requires you to order at the counter and wait while your food is made and packed? All else being equal, consumers are demanding more flexibility, convenience, and speed, and meeting those demands requires technology. The business that embraces technology also gains a valuable injection of business data: rich user profiles, order history, and a captive audience for promotions.
This study, commissioned by Dell EMC, is a follow-up to a similar study performed last year, and it shows that more enterprise decision makers are recognizing the importance of IT to their business and reaping the benefits of IT Transformation.
“Companies today need to be agile to stay competitive and drive growth, and IT Transformation can be a major enabler of that,” said John McKnight, Vice President of Research, Enterprise Strategy Group. “It’s clear that IT Transformation is increasingly resonating with companies and that senior executives recognize how IT Transformation is pivotal to overall business strategy and competitiveness. While achieving transformation can be a major endeavor, our research shows ‘Transformed’ companies experience real business results, including being more likely to be ahead of the competition in bringing new products and services to market, making better, faster data-driven decisions than their competition, and exceeding their revenue goals.”
The survey respondents were grouped into a variety of IT transformation "maturity stages":
- Stage 1 – Legacy (6%): Falls short on many – if not all – of the dimensions of IT Transformation in the ESG study
- Stage 2 – Emerging (45%): Showing progress in IT Transformation but having minimal deployment of modern data center technologies
- Stage 3 – Evolving (43%): Showing commitment to IT Transformation and having a moderate deployment of modern data center technologies and IT delivery methods
- Stage 4 – Transformed (6%): Furthest along in IT Transformation initiatives
Many of those organizations are working to improve their IT efforts, but the companies in the Transformed category are proving the most successful at folding technology initiatives into their workflows. History has rewarded businesses willing to embrace technology, and as more companies adopt new technology, those benefits should spread across the market.
To read more about this study please see the Dell EMC Press Release.
Any IT professional knows that keeping system patches up to date is critical, since patches address essential concerns like security and stability. With recent security issues such as Spectre and Meltdown, timely patching has become even more important - but even patches pose risks at times. It takes an informed IT team to determine which patches are appropriate and what techniques are needed to install them.
Patch management isn't as simple as installing whatever your system recommends - there's much more to it than that. IT teams must take into account the specific (and often custom) architecture of their environments, their integration points, and how hardware and software could be affected. At worst, ports could be closed, infrastructure could be disabled, and the system could fail. To avoid this, a proper patch management plan should follow six steps:
- Make a priority of patch management: By prioritizing patch management over other IT tasks, team members have the necessary time to install and test patches.
- Keep an up-to-date inventory: IT needs an accurate accounting of every asset in the inventory to know which patches are needed and when. It's suggested that enterprises work toward hardware and software standardization to make this task easier.
- Test, test, test: Evaluate systems for potential patch conflicts, and then methodically install said patches with plenty of testing afterwards. If possible, build a test environment isolated from your production environment.
- Accept that it will be difficult: The process of patching, especially on a large system, will not be easy. However, accepting that the process is a necessary evil for the safety and stability of the system will make it all the more palatable to complete.
- Put someone in charge: Any IT tech can install patches, but it's important to maintain accountability for patches being managed. By assigning an IT tech to oversee patches there's a better chance that patches are installed, tested, and updated.
- Keep a record: Having detailed records of the who/what/when/where/why of patch maintenance makes future maintenance easier, especially if there's turnover in your IT department.
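The inventory and record-keeping steps above can be sketched in a few lines of code. This is a minimal illustration only, not any real product's schema; the asset names, software versions, patch IDs, and record fields are all hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Asset:
    name: str
    software: str
    version: str
    patches: list = field(default_factory=list)  # applied-patch history

class PatchInventory:
    """Toy tracker for the inventory and record-keeping steps."""

    def __init__(self):
        self.assets = {}

    def add_asset(self, name, software, version):
        self.assets[name] = Asset(name, software, version)

    def record_patch(self, asset_name, patch_id, applied_by, applied_on, tested):
        # Keep the who/what/when of every patch for future maintenance and audits
        self.assets[asset_name].patches.append(
            {"patch": patch_id, "by": applied_by, "on": applied_on, "tested": tested}
        )

    def unpatched(self, patch_id):
        # Assets that still need the given patch
        return [a.name for a in self.assets.values()
                if patch_id not in (p["patch"] for p in a.patches)]

# Hypothetical usage: two assets, one patch applied and tested so far
inv = PatchInventory()
inv.add_asset("web-01", "nginx", "1.14")
inv.add_asset("db-01", "postgres", "10.3")
inv.record_patch("web-01", "PATCH-2018-001", "jsmith", date(2018, 4, 20), tested=True)
```

Even a simple record like this answers the key questions a patch plan raises: which assets still need a given patch, who applied it, and whether it was tested.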
These suggestions can be more easily accomplished with patch management software (either commercial or bespoke) or, for smaller enterprises, strict adherence to a comprehensive patch policy. Making a habit of patch discipline not only helps safeguard systems proactively, it also familiarizes IT departments with the process for when a crisis hits and patches are needed immediately. That way, systems can be kept online and in production.
This article was based on an April 11, 2018 IT News article by Mary K. Pratt.
Ransomware is a menace to data security - locking up a user's files and throwing away the key until a ransom is paid - and it's now showing up in the cloud. Microsoft is working to remedy this with new protections for business and personal OneDrive accounts, including the ability to revert a user's account to a previous snapshot up to a month in the past. This measure helps users recover from compromises without professional help, though it does require them to act before the month is up. Helpfully, Microsoft is also adding monitoring utilities that notify users when a suspected infection has occurred.
The protections have already been enabled for OneDrive, and Outlook.com will see its updates in the next few weeks. Office 365 subscribers will also see additional protections for files shared and read via the platform, such as the ability to password-protect access and to prevent links to files from being forwarded.
"With the growing presence and sophistication of online threats like viruses, ransomware, and phishing scams, it’s increasingly important to have the right protection and tools to help protect your devices, personal information, and files from being compromised,” Kirk Koenigsbauer, the corporate vice president for Office, wrote in a blog post.
Given ransomware's method of infecting every system it can reach on a network and cloud accounts with open connections, Microsoft recommends backing up all of your important files to OneDrive now, before there's a problem. Otherwise, users will lose them when their computer ultimately has to be wiped to clean up the ransomware. Even if a ransomware attack does reach your OneDrive account, you have the option to roll it back.
Microsoft has also announced plans to roll out proactive URL-checking software for Office products later this year - these utilities will scan documents for suspicious links and notify users to remove them.
This article was based on an April 5, 2018 IT News article by March Hachman.
SpaceX, the brainchild of Tesla founder Elon Musk, recently secured permission from the Federal Communications Commission to provide global satellite broadband services. This approval, the first of its kind, covers a fleet of 4,425 low Earth orbit satellites built on new technologies - far more than the 1,419 active satellites that orbit the Earth today. The system is intended to bring high speed broadband service to residential, commercial, institutional, governmental, and educational users worldwide, and even stands to bring internet service to remote areas of the world.
Typical broadband satellites weigh in at several tons and can be as large as a bus, but these satellites are different. SpaceX's satellites are estimated to weigh about 850 lbs and to be roughly the size of a small car. They'll orbit the Earth at an altitude of 715-790 miles. In the quantity anticipated, the satellites can cover an area about 1,300 miles wide (roughly the distance from Maine to the Florida panhandle).
To start, SpaceX intends to send 1,600 satellites into space and then follow them with a second phase of 2,825 satellites placed in four shells at different altitudes. Services will be limited at first, but once all satellites are in place the system can support high speed (1 Gbps per user) internet globally. The current average global internet speed is 5.1 Mbps, about 200 times slower than SpaceX's offering, which highlights how remarkable the satellite system could be.
SpaceX's filing also includes the following specifications, as noted in a Business Insider article on the topic:
- High capacity: Each satellite in the SpaceX System provides aggregate downlink capacity to users ranging from 17 to 23 Gbps, depending on the gain of the user terminal involved. Assuming an average of 20 Gbps, the 1600 satellites in the Initial Deployment would have a total aggregate capacity of 32 Tbps. SpaceX will periodically improve the satellites over the course of the multi-year deployment of the system, which may further increase capacity.
- High adaptability: The system leverages phased array technology to dynamically steer a large pool of beams to focus capacity where it is needed. Optical inter-satellite links permit flexible routing of traffic on-orbit. Further, the constellation ensures that frequencies can be reused effectively across different satellites to enhance the flexibility and capacity and robustness of the overall system.
- Broadband services: The system will be able to provide broadband service at speeds of up to 1 Gbps per end user. The system's use of low-Earth orbits will allow it to target latencies of approximately 25-35 ms.
- Worldwide coverage: With deployment of the first 800 satellites, the system will be able to provide U.S. and international broadband connectivity; when fully deployed, the system will add capacity and availability at the equator and poles for truly global coverage.
- Low cost: SpaceX is designing the overall system from the ground up with cost-effectiveness and reliability in mind, from the design and manufacturing of the space and ground-based elements, to the launch and deployment of the system using SpaceX launch services, development of the user terminals, and end-user subscription rates.
- Ease of use: SpaceX's phased-array user antenna design will allow for a low-profile user terminal that is easy to mount and operate on walls or roofs.
- Lifespan: The satellites will last between five and seven years and decay within a year after that.
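The headline figures above are easy to sanity-check with back-of-the-envelope arithmetic; the numbers below come straight from the filing and the article (20 Gbps is the filing's own assumed per-satellite average).

```python
# Sanity-check the capacity and speed figures quoted above
initial_satellites = 1600     # size of the Initial Deployment
avg_downlink_gbps = 20        # midpoint of the 17-23 Gbps per-satellite range

# Aggregate capacity of the initial constellation, in terabits per second
total_tbps = initial_satellites * avg_downlink_gbps / 1000

# How the 1 Gbps per-user target compares to the quoted 5.1 Mbps average
speedup_vs_average = 1000 / 5.1

print(total_tbps)            # 32.0 Tbps, matching the filing
print(round(speedup_vs_average))  # roughly 200x, matching the article
```

The arithmetic confirms the filing's internal consistency: 1,600 satellites at 20 Gbps each is exactly the 32 Tbps aggregate it claims, and 1 Gbps is about 196 times the quoted 5.1 Mbps average.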
There are other players in the high speed broadband market too - Google parent company Alphabet is currently working on a broadband system utilizing satellites, balloons, and drones, though it doesn't have the investment that SpaceX has received.
This article was based on a March 29, 2018 Reuters article and a November 16, 2016 Business Insider article by Dave Mosher.
Amazon was recently in the news for its new grocery store concept, Amazon Go, in Seattle. The store, the first of its kind and a technology testbed for Amazon, uses a variety of cameras and sensors to track the items shoppers select, then automatically charges their credit cards once they leave. All customers have to do is check in via smartphone at a kiosk at the entrance. No waiting in lines, no cashiers - just grab your items and leave (which, admittedly, feels wrong to many shoppers).
So, how's it Amazon Go-ing? Surprisingly well, says Amazon Vice President Gianna Puerini. A majority of traffic comes from nearby offices, because it's so simple to grab a drink or lunch and return to work. In fact, store associates spend most of their time restocking shelves - an easy proxy for the store's success. Granted, the store has barely outgrown its novelty period, but Amazon is already noticing repeat visits. The question remains: will Amazon expand this technology to its recently purchased Whole Foods? No, according to Amazon, but analysts are skeptical.
Amazon Go is under close scrutiny from its parent, of course, with its metrics closely analyzed to drive future business decisions and expansions - but also from the retail industry at large, which wants to see how viable the format is. Research-focused venture capital firm Loup Ventures wrote of Amazon Go last month: “Our experience was flawless, leaving us increasingly confident that Amazon is best positioned to own the operating system of automated retail. Eventually, we expect Amazon to make this technology available to other retailers.” But the technology still needs improvement, a task that is easier to handle with a single shop and a team of engineers waiting in the wings to make fixes. Their current task is teaching computers to recognize retail items more quickly, possibly by simply showing a product to a camera. We humans can learn to recognize an item after only two or three views, so why can't the computers?
With time we'll see what Amazon is able to accomplish with its pet project, but for now it's an interesting glimpse of computers taking on unfamiliar roles.
This article was based on a March 18, 2018 Reuters article by Jeffrey Dastin.
PaperFree partner OpenText recently announced an addition to its 2018 OpenText Enterprise World lineup - the inaugural Women in Technology Summit. The conference will be held in Toronto, Canada this July as a follow-up to, and expansion of, the wildly popular 2017 Women in Technology Luncheon at Enterprise World. The Summit is inspired by International Women's Day and its Press for Progress campaign, and will feature speakers sharing thoughts on the challenges of achieving diversity in the technology sector, as well as keynotes on the progress that has been made and its positive effects on the workplace.
“OpenText is honoured to create a new leadership forum at OpenText Enterprise World, the Women in Technology Summit,” said Mark J. Barrenechea, Vice Chair, CEO and CTO at OpenText. “Innovation is at the heart of OpenText, and diversity is a key element of innovation. The Women in Technology Summit will bring leaders together from across North America to further advance innovation through diversity.”
The summit will feature former Chief Technology Officer of the United States Megan Smith; Stephanie Lampkin, CEO of Blendoor; and Jodi Kovitz, CEO and founder of #MovetheDial. The Summit will also feature two panel events moderated by prominent broadcast journalist Marci Ien, who currently co-hosts The Social. The panels will focus on topics including advancing women in technology to positions of leadership and influencing the future of women in STEM careers. The panelists will include:
- Pearl Sullivan, University of Waterloo Dean of Engineering
- Mei Dent, OpenText Vice President of Research and Development
- Danny Allen, SAP Labs Vice President of Tech Diversity and Inclusion
- Anar Simpson, Technovation Global Ambassador
- Saadia Muzaffar, founder of TechGirls of Canada
- Lucia Melgarejo, SAP UX Designer
The summit will be held Thursday, July 12, 2018 from 9am to 5pm at the Metro Toronto Convention Centre and registration is open now via OpenText.
This article was based on a March 7, 2018 OpenText press release.
Malaysia Airports is in the midst of a five-year business plan to improve passenger efficiency and experience at its main airport, Kuala Lumpur International Airport. Among these plans is the Total Airport Experience strategic pillar - a project built on an immense data analytics and digital platform - and PaperFree partner OpenText has been chosen as the technology partner for the endeavor. OpenText will help develop intelligent automation, predictive analytics, and expanded digital services for passengers at the airport, one of the busiest in Southeast Asia with 52 million passengers annually.
This unified digital platform aims to more easily distinguish engagement with passengers versus other customers, improve operational efficiency, improve passenger productivity and experience, and open new avenues for revenue streams. Using machine learning, OpenText's solution will analyze data from existing systems to provide in-depth insights for better decision making. The solution includes content and big data management technology, as well as service quality management, customer experience management, and asset management services.
The project begins with a mobile app that helps passengers plan their journey through the airport. The app notifies users of changes to their flight, congestion levels, and expected trip times through the terminals, and suggests alternate ways of getting around. It also offers self check-in and self bag-drop functions. In the future, the app's functionality will be expanded to include airport shopping with delivery to the gate. This technological upgrade benefits not only the airport but also the airlines and vendors that use its facilities: checked-in passengers' locations will be monitored and provided to airlines in an effort to reduce flight delays caused by missing passengers and the need to offload their luggage.
One department that stands to see immense improvement is immigration - new technology will allow officials to cross-check passenger documents against flight manifests before passengers even reach the new security facilities. Anyone who has passed through immigration at an airport can attest that it is one area of travel where efficiency improvements are most welcome. OpenText's solution also specifically addresses wait times by predicting queue loads and allocating resources accordingly.
“OpenText’s proven track record of executing large-scale digital transformation programs, built on our leading Enterprise Information Management platform and services, will enable OpenText to help Malaysia Airports with its vital transformation strategy. As a global hub for commerce, Kuala Lumpur International Airport is truly embracing a progressive model for the airport of the future, one that we at OpenText are excited to play a major role in,” said Mark J. Barrenechea, Vice Chairman, CEO and CTO, OpenText.
This article was based on a February 27, 2018 OpenText press release.
Los Angeles is known for its soul-crushing commutes, yet strangely enough, its residents largely don't rely on public transportation to ease the crunch. This is likely because gas prices are still low and buses sit in the very same traffic as everyone else. The area's light rail and underground metro systems, meanwhile, can't hold a candle to the more accessible networks of other large cities like London and Tokyo. Given this, it's easy to see why the average Angeleno still reaches for the car keys every morning.
The Los Angeles County Metro bus system, citing falling ridership, aims to address this with technology that could make taking the bus faster than it used to be. The project, expected to run over the next few years, starts with adding wifi connectivity to all 2,200 buses by 2019. This not only enables productivity while riding, it adds new capabilities to the transit management system. With wifi, buses can report their location every five seconds, compared with every three minutes under the previous radio system. With more accurate data, planners can design routes more efficiently, anticipate bottlenecks, and even sneak in an extra bus or two. Customers themselves won't be tracked, but their wifi use will be, giving planners a better idea of demographics and interests by bus location and route - data they can use to select content for digital ads on buses.
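A quick back-of-the-envelope calculation shows what the faster reporting interval means at fleet scale; the fleet size and intervals below are the figures from the article.

```python
# Position-report volume before and after the wifi upgrade
buses = 2200            # fleet size cited in the article
old_interval_s = 180    # previous radio system: one report every three minutes
new_interval_s = 5      # wifi-connected buses: one report every five seconds

reports_per_hour_old = buses * 3600 // old_interval_s
reports_per_hour_new = buses * 3600 // new_interval_s
frequency_gain = old_interval_s / new_interval_s

print(reports_per_hour_old)  # 44,000 fleet-wide location reports per hour
print(reports_per_hour_new)  # 1,584,000 reports per hour
print(frequency_gain)        # 36x more frequent updates per bus
```

Going from 44,000 to more than 1.5 million fleet-wide position reports per hour is what turns the feed from a rough snapshot into data fine-grained enough for route planning and bottleneck prediction.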
The second step in the digital transformation is preparing the system to be autonomous-ready. Passengers currently rely on drivers for information and for reporting problems, but without a driver they would be limited. Metro has this covered with a recently released smartphone app built on Microsoft Azure cloud technology. The app offers extensive, detailed information about bus routes and stops, as well as the ability to connect with a transit agent for specific questions. It also features an emergency-reporting function which, when activated by a passenger, alerts Metro and law enforcement and relays video from ten on-board cameras along with geolocation data.
Wifi and automation are great, but they don't necessarily get you to your destination any faster. That's where Metro's Orange Line comes in - an 18 mile route in the San Fernando Valley. The route has a dedicated bus lane, but buses must still stop for red lights like everyone else. Metro is currently evaluating data from traffic signals and connected buses on the route to see whether adjusting a bus's speed can help it catch more green lights, reducing both time spent waiting and time to destination. As a bonus, buses run more cleanly when they aren't constantly stopping and starting.
With these improvements the Los Angeles Metro bus system is hoping to not only improve ridership numbers, but also do its part to help reduce congestion in the area. “All that work makes buses safer, faster, more efficient and more responsive to customer needs, and hopefully that improves our ridership,” says Doug Anderson, senior director of digital strategy and innovation at Metro.
This article was based on a February 14, 2018 Microsoft Feature by Vanessa Ho.
Tesla, the manufacturer of popular electric vehicles and an even more popular orbiting Roadster, recently announced that its Amazon Web Services account was hacked and used to mine cryptocurrency. The company was first informed of the breach by cybersecurity firm RedLock, which found the compromise while surveying Tesla's platform for vulnerabilities as part of Tesla's bug bounty program (RedLock was compensated for the find). The breach was traced to an IT administrative console that lacked a password, though Tesla has not been able to determine who was behind the attack or how much cryptocurrency was mined. Tesla remedied the vulnerability and stopped the attack within hours of being notified by RedLock.
Amazon Web Services is the online retailer's popular cloud computing division and one of the company's most profitable services. However, its accounts have become a target for "cryptojacking", a practice in which hackers hijack a system's resources to mine cryptocurrency - an increasingly lucrative pursuit.
Tesla vehicles are known for being ultra-connected to the company's home servers, so what does this security breach mean for vehicle owners and their safety? Fortunately, not very much. The breach appears to have been limited to Tesla company data, such as data relating to the company's test cars. However, some of Tesla's proprietary data on mapping, telemetry, and vehicle servicing was compromised and could fall into the hands of competitors. The breach also highlights a common concern about autonomous driving systems (a mild form of which Tesla offers in its Autopilot): the computers controlling vehicles can be compromised and give unsafe or incorrect commands.
As is always the mantra when it comes to data security: secure, secure, secure, and then secure some more. Hackers are becoming more creative and aggressive in their attacks, and even end users can take simple steps to help thwart data loss.
This article was based on a February 20, 2018 Business Insider article by Mark Matousek.
Many enterprises are finding it necessary to evaluate their storage and processing needs, even to the point of considering moving mission-critical systems to hyper-converged infrastructure (HCI). However, Dell EMC says "not so fast". While the company encourages moving workloads to HCI, the technology isn't quite ready for critical systems. That said, updates coming by the end of 2018 should bring HCI and more traditional architectures into alignment.
The "SAN-less" HCI market is hot right now, and the technology is finding favor in datacenter applications that can take advantage of its exceptional scalability. HCI lends itself to general IT use and agile environments such as Internet of Things (IoT) and analytics workloads. However, "traditional, monolithic applications with large relational and non-relational databases, such as SAP, Oracle, the patient records system Epic" are best served by SAN-based storage for now. Why? Simply put, mission-critical applications rely on the SAN's underlying hardware for encryption, replication, and high availability, whereas in HCI these capabilities come from the software stack.
However, the time is approaching when those feature sets will align, and the differences between SAN and SAN-less products will shrink. In the meantime, taking advantage of existing hardware reaching end of life is a smart plan for enterprises, says Dell EMC: non-mission-critical applications can move to HCI platforms now, and the rest can follow later.
This article was based on a February 12, 2018 ComputerWeekly article by Antony Adshead.
Get in Touch with PaperFree
1-888-726-7730 • firstname.lastname@example.org