Most of us know the acronym LAMP, used to describe web services stacks built with Linux, the Apache web server, the MySQL database server, and PHP, Perl, or Python.
But another web-stack acronym has come to prominence in the last few years: MEAN—a stack that uses the MongoDB NoSQL database, Express, Angular, and Node.js.
Kevin Xu is general manager of global strategy and operations at PingCAP.
TiDB is an open-source, cloud-native, MySQL-compatible distributed database that handles hybrid transactional and analytical processing (HTAP) workloads. It is a member of the “NewSQL” class of relational databases that are designed to be deployed at massive scale. For those of you wondering, the “Ti” stands for Titanium.
PingCAP started building TiDB just three and a half years ago, but already the product has gathered upwards of 15,000 GitHub stars, 200 contributors, 7200 commits, 2000 forks, and 300 production users. Recently TiDB also collected InfoWorld’s 2018 Bossie Award as one of the best open source software projects in the data storage and analytics space.
GitHub has introduced a workflow tool called GitHub Actions to its popular code-sharing site, to allow continuous integration/continuous deployment (CI/CD) right from GitHub itself.
Using the tool, which is now in limited beta, developers can build, deploy, and update software projects on either GitHub or an external system without having to run code themselves. Workflows and infrastructure deployments can be expressed as code.
Actions adds customizable workflow capabilities to GitHub.com, letting developers build and share code containers to run software development workflows, even across multiple clouds. Other tasks that can be automated with actions include packaging an npm module or sending an SMS alert.
A poll of 250 IT decision makers across North America, conducted by managed services provider Softchoice, found that preparation for cloud initiatives is on track: 83 percent of those polled said they had assessed existing applications to determine if they were ready for the cloud, and 82 percent had modernized their data centers in preparation for the cloud. Moreover, 72 percent had internally communicated the business impact of a cloud strategy.
But there were some surprises in what companies discovered once they moved to the cloud:
Cloud services like Azure offer a lot of security features straight out of the box, especially if you’re using their platform services. But virtual infrastructures are much like physical infrastructures, connecting virtual machines with software-defined virtual networks. Thus, they need the same security and network management tools as your own data center and your own application infrastructures.
Two services are key to securing and managing Azure-hosted networks, focusing on different parts of the cloud journey.
- The Azure Firewall is for your first application, for API and web-based code that’s important to your business but not critical.
- As applications and services grow, and as businesses move more and more code from on-premises to the cloud, your needs will change, and you’ll need tools to help scale your services as well as secure them. To do that, Azure Front Door combines security and load-balancing features, using edge services to control and direct access to globally distributed applications.
There’s no conflict between these two services. Azure Firewall gets you started, and you can use it to build out an application until traditional routing and load-balancing techniques start to fail. That’s when you add Front Door to your architecture, adding a new layer above your existing networking tools. They can stay in place as a backup to Front Door, or they can be removed once you’re happy with how Front Door operates.
One of the last computing chores to be sucked into the cloud is data analysis. Perhaps it’s because scientists are naturally good at programming and so they enjoy having a machine on their desks. Or maybe it’s because the lab equipment is hooked up directly to the computer to record the data. Or perhaps it’s because the data sets can be so large that it’s time-consuming to move them.
Whatever the reasons, scientists and data analysts have embraced remote computing slowly, but they are coming around. Cloud-based tools for machine learning, artificial intelligence, and data analysis are growing. Some of the reasons are the same ones that drove interest in cloud-based document editing and email. Teams can log into a central repository from any machine and do the work in remote locations, on the road, or maybe even at the beach. The cloud handles backups and synchronization, simplifying everything for the group.
Microsoft is perhaps the most impressive company on the planet right now. While it doesn’t (currently) dominate markets like it used to, Microsoft has managed something dramatically more difficult, something that portends future success as a platform behemoth: profound cultural change.
Microsoft recently announced that it is effectively open-sourcing its 60,000-plus patent portfolio by joining the Open Invention Network. It claims this move will “help protect Linux and open source.”
Microsoft’s Entity Framework is an open-source object-relational mapper, or ORM, for ADO.Net that helps you isolate the object model of your application from the data model. Entity Framework simplifies data access in your application by allowing you to write code to perform CRUD (Create, Read, Update, and Delete) operations without having to know how the data is persisted in the underlying database.
The DbContext acts as a bridge between the domain classes and the database. In this article we will examine how to configure the DbContext using an instance of DbContextOptions to connect to a database and perform CRUD operations using the Entity Framework Core provider.
DbContext explained
The DbContext is an integral component of the Entity Framework that represents a connection session with the database. You can take advantage of the DbContext to query data into your entities or save your entities to the underlying database. The DbContext in Entity Framework Core has a number of responsibilities:
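EF Core itself is a C# framework, but the session and unit-of-work role that a DbContext plays can be illustrated in a few lines of TypeScript. This is a rough conceptual analogy only; none of the names below (`MiniContext`, `saveChanges`, and so on) are real EF Core APIs.

```typescript
// Conceptual sketch of the unit-of-work role a DbContext plays:
// track entity changes in memory, then persist them all at once.
// These names are hypothetical, not EF Core APIs.

interface Entity { id: number; }

class MiniContext<T extends Entity> {
  private store = new Map<number, T>();      // stands in for the database
  private pending: Array<() => void> = [];   // tracked, unsaved changes

  add(entity: T): void {
    this.pending.push(() => this.store.set(entity.id, entity));
  }

  remove(id: number): void {
    this.pending.push(() => this.store.delete(id));
  }

  find(id: number): T | undefined {
    return this.store.get(id);               // query the "database"
  }

  // Like SaveChanges(): flush all tracked changes in one unit of work.
  saveChanges(): number {
    const count = this.pending.length;
    this.pending.forEach(apply => apply());
    this.pending = [];
    return count;
  }
}

const ctx = new MiniContext<{ id: number; name: string }>();
ctx.add({ id: 1, name: "Ada" });
const before = ctx.find(1);       // undefined: nothing persisted yet
const saved = ctx.saveChanges();  // flushes one tracked change
const after = ctx.find(1);        // the entity is now visible
```

The key idea mirrored here is that queries and saves flow through one session object, which batches changes until they are explicitly committed.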
Not to name names, but I’ve been reading in several publications that one of the main reasons to go multicloud is to avoid vendor lock-in. While I can see the logic behind this assumption—that having more cloud providers means you can be more independent—the reality is much different.
For example, if you have an application in the cloud, and you’re using a multicloud architecture, you’ll have two or three choices where to place that application workload and associated data: Amazon Web Services, Microsoft Azure, and/or Google Cloud Platform.
Some R users are leery of graphical user interfaces. Pointing, clicking, and dragging may be convenient, but it can be harder to save, check, or rerun an analysis.
But I think even most hardcore command-line junkies would agree that a drag-and-drop interface can be helpful for some exploratory data visualization.
That’s what the new R package esquisse brings to ggplot2. It gives the best of both worlds: drag-and-drop, plus generating basic ggplot code for the graphs you create. And, it’s pretty cool! esquisse was created by two people at a French R consulting firm, DreamRs. The name esquisse is French for sketch.
Angular provides dependency injection, which is particularly useful for assembling data services for applications, along with the use of HTML templates to compose components. In Angular, developers compose components with an HTML template that connects to TypeScript code for the imperative parts of the program.
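The pattern Angular's injector automates is constructor-based dependency injection. A minimal sketch in plain TypeScript, with hypothetical `DataService` and `UserComponent` names (Angular's real injector wires this up from `@Injectable` metadata):

```typescript
// A service holding data access logic; stand-in for one built with
// Angular's @Injectable decorator.
class DataService {
  fetchUsers(): string[] {
    return ["alice", "bob"];   // stand-in for an HTTP call
  }
}

class UserComponent {
  // The dependency is supplied from outside rather than constructed here,
  // so a test can pass a stub service instead of the real one.
  constructor(private data: DataService) {}

  userCount(): number {
    return this.data.fetchUsers().length;
  }
}

// Angular's injector would perform this wiring automatically;
// here we do it by hand to show the pattern.
const component = new UserComponent(new DataService());
const count = component.userCount();
```

Because `UserComponent` never constructs its own `DataService`, swapping in a mock for unit testing requires no changes to the component itself.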
Puppet is developing applications to provide operational insights and vulnerability assessments for devops. Both the insights and vulnerability tools are in private beta.
Devops insights tool
The Puppet Insights tool, which arose from Puppet’s June 2018 acquisition of data visualization company Reflect, is intended to measure the impacts of devops investments by aggregating and analyzing data across the tool chain, with visibility into the software delivery pipeline. Dashboards help identify the velocity, quality, and impact of software delivery teams and processes. Aspects such as software defect rates are measured in evaluating a software delivery project.
Kasun Indrasiri is the director of integration architecture at WSO2.
Increasingly, developers rely on a microservices architecture to build an application as a suite of fine-grained, narrowly focused, and independent services, each of which is developed and deployed independently. Despite the agility fostered by the microservices approach, it also brings new challenges, since these services have to interact with each other and with other systems, such as web APIs and databases, via network calls. And because the network is always an unreliable factor, such interactions are susceptible to failure at any time.
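Because those network calls can fail at any time, resilient microservices wrap them in defensive patterns such as retry with exponential backoff. Here is a generic TypeScript sketch of that idea; it is not any particular library's API, and the flaky call is a simulated stand-in for a real service request:

```typescript
// Retry a failing async call with exponential backoff.
// Generic sketch; names and defaults are illustrative.
async function withRetry<T>(
  call: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await call();
    } catch (err) {
      lastError = err;
      // Exponential backoff: 100 ms, 200 ms, 400 ms, ...
      await new Promise(resolve =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Simulated flaky service call: fails twice, then succeeds.
let tries = 0;
const flaky = async (): Promise<string> => {
  tries++;
  if (tries < 3) throw new Error("connection reset");
  return "ok";
};

const resultPromise = withRetry(flaky);
```

In production systems this pattern is usually combined with circuit breakers and timeouts so that a struggling downstream service is not hammered with retries.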
A book published in 1981, called Nailing Jelly to a Tree, describes software as “nebulous and difficult to get a firm grip on.” That was true in 1981, and it is no less true nearly four decades since. Software, whether it is an application you bought or one that you built yourself, remains hard to deploy, hard to manage, and hard to run.
Docker containers provide a way to get a grip on software. You can use Docker to wrap up an application in such a way that its deployment and runtime issues—how to expose it on a network, how to manage its use of storage and memory and I/O, how to control access permissions—are handled outside of the application itself, and in a way that is consistent across all “containerized” apps. You can run your Docker container on any OS-compatible host (Linux or Windows) that has the Docker runtime installed.
The process of learning in general often means making mistakes and taking the wrong paths, and then figuring out how to avoid these pitfalls in the future. Machine learning is no different.
As you implement machine learning in your enterprise, be careful: Some technology marketing might suggest that the learning gets things very right very fast, which is an unrealistic expectation for the technology. The truth is that there are bound to be mistakes in the machine learning process. And these mistakes can get encoded, at least for a while, in business processes. The result: Those mistakes now happen at scale and often outside immediate human control.
Microsoft has released through open source its Infer.Net cross-platform framework for model-based machine learning.
Infer.Net will become part of the ML.Net machine learning framework for .Net developers, with Infer.Net extending ML.Net for statistical modeling and online learning. Several steps toward integration already have been taken, including the setting up of a repo under the .Net Foundation.
Microsoft cited the applicability of Infer.Net to three use cases:
Microsoft has released the 0.6 version of its ML.Net machine learning framework, aimed at .Net developers. The update adds a new and more useful model-building API set, the ability to use more existing models to provide predictions, and better performance overall.
The original ML.Net API limited the kinds of pipelines you could build and had some clumsy restrictions on labeling and scoring data. The new API more flexibly allows training and prediction processes to be made up of multiple components that can be joined together in a variety of combinations, instead of requiring a single linear pipeline. The goal is to emulate the design of APIs used to drive other frameworks like Apache Spark.
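The design idea—small components joined in arbitrary combinations rather than one fixed, linear chain—can be sketched abstractly in TypeScript. This is a conceptual illustration of composable pipelines, not ML.Net's actual C# API, and the featurization stages are hypothetical:

```typescript
// A pipeline stage is just a reusable transform from one type to another.
type Stage<A, B> = (input: A) => B;

// Joining two stages yields a new stage, so compositions nest freely
// instead of being locked into a single linear sequence.
function join<A, B, C>(first: Stage<A, B>, second: Stage<B, C>): Stage<A, C> {
  return (input: A) => second(first(input));
}

// Hypothetical featurization stages.
const normalize: Stage<number[], number[]> = xs => {
  const max = Math.max(...xs);
  return xs.map(x => x / max);   // scale values into [0, 1]
};
const mean: Stage<number[], number> = xs =>
  xs.reduce((a, b) => a + b, 0) / xs.length;

// The same stages can be recombined in any order or combination.
const pipeline = join(normalize, mean);
const score = pipeline([2, 4, 8]);  // mean of [0.25, 0.5, 1]
```

The payoff of this style is that each stage stays independently testable, and a trained pipeline can be rearranged or extended without rewriting the whole chain.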
Swift, Apple’s language for MacOS and iOS development, is vying for a permanent position in the Top 10 of the Tiobe index of programming language popularity, but it has competition from Google’s Go language (Golang) and the R language. All three languages, however, face obstacles to their ascendance in the index.
The four-year-old Swift was in tenth place in the index in October 2018. Swift has ranked this high before, but it has never scaled above tenth place. Swift’s inability to build Google Android applications has had developers instead using cross-platform frameworks, capping Swift’s reach, software quality services vendor Tiobe said.