Improved Data Security with MySQL Privileges and DreamFactory

DreamFactory and MySQL

All MySQL installations naturally include a root account and offer the ability to create restricted user accounts. However, otherwise sane developers will often use these root accounts for application-level communication, dramatically raising the likelihood of data exfiltration and other security breaches. For that reason the DreamFactory team always recommends users take care to create restricted MySQL users before using the platform to generate APIs.

In this tutorial, you’ll learn how to create a non-root MySQL user and then further restrict this user’s privileges to a specific database and even table subset. You’ll also learn how to subsequently revoke a user’s privileges to reflect changing requirements.
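As a preview of the approach the tutorial covers, here's a minimal sketch; the database, account name, and password below are placeholders of my own choosing, not values from the tutorial:

# Connect as root only to administer accounts -- never for application traffic
mysql -u root -p <<'SQL'
-- Create a restricted account for the application/API to use
CREATE USER 'api_user'@'localhost' IDENTIFIED BY 'use-a-strong-password';

-- Grant read-only access to a single database (tables can also be named individually)
GRANT SELECT ON corporate.* TO 'api_user'@'localhost';

-- If requirements later change, privileges are just as easy to revoke
REVOKE SELECT ON corporate.* FROM 'api_user'@'localhost';
SQL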

Continue reading “Improved Data Security with MySQL Privileges and DreamFactory”

Learning About The Bitnami System Database


The Elusive Bitnami System Database

If you want to spin up a fast API solution, DreamFactory is a great way to do that with a Bitnami install. Within minutes you can have a fully documented and secure REST API to utilize. Just like any program bundle, there are lots of features to learn and interact with. Outside of a Docker Swarm or AWS ELB setup, it is pretty hard to find a way to spin up a DreamFactory instance faster. We are going to dive in a bit further to find out how to interact with the system database.

Continue reading “Learning About The Bitnami System Database”
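As a quick taste before you click through: on a Bitnami install the bundled MySQL/MariaDB client lives under /opt/bitnami (or /opt/bitnami/mariadb on newer stacks), and the system database name is whatever your stack's .env file declares. The database name below is an illustrative assumption, so check DB_DATABASE in your own install:

# List databases to locate the DreamFactory system database
/opt/bitnami/mysql/bin/mysql -u root -p -e 'SHOW DATABASES;'

# Peek at the system tables (the database name here is an assumption --
# confirm it against DB_DATABASE in the stack's .env file)
/opt/bitnami/mysql/bin/mysql -u root -p bitnami_dreamfactory -e 'SHOW TABLES;'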

AWS Redshift – SQL Functionality on Planet-Scale Hardware

The Problem

Your manager’s peers have been bragging a lot lately about their data warehouses, analytics, and charts, and now a steady stream of data-related questions is being sent your way. Your department maintains several databases, and the data they contain has the potential to answer everything management is asking for. But the databases are needed for day-to-day operations, and can’t scale to answer these often highly specific questions, such as “How many asparaguses were consumed by men named Fonzie in Cleveland on Tuesdays in 2013?” How to unlock the potential of this data?

You’ve probably heard of data warehouses, which are tailor-made for this sort of witchcraft. They make it possible to unlock every bit of value from data, and find answers wickedly fast. In the past, creating and maintaining data warehouses meant large, ongoing investments in hardware, software, and people to run them. This would be a hard sell – isn’t the company already spending enough?! Good news, however! In this day of cloud computing, it’s incredibly simple to create, load, and query data warehouses. They typically charge on a usage basis, meaning you don’t need the initial upfront capital investment to get off the ground. And they are super fast – far more powerful than anything you could run in-house.

This post will focus on Amazon Web Services (AWS) Redshift. And as a bonus, I’ll demonstrate the incredible DreamFactory, which automatically builds a slick REST API interface over the top. From there, you’re a GUI away from giving management everything they could ask for, and wowing them with extras they hadn’t even thought of. They can now stand tall amongst their fellow executives, knowing you have their back.

AWS Redshift

AWS Redshift is built upon PostgreSQL, but has been dramatically enhanced to run at “cloud scale” within AWS. There are a few ingredients to this secret sauce:

Column-oriented storage

While you don’t need a deep understanding of what’s happening under the hood to use it, Redshift employs a fascinating approach to achieve its mind-boggling performance. Let’s say you have data that looks like the following:
ID  NAME    CREATED     DESCRIPTION  AMOUNT
1   Harold  2018/01/01  Membership   10.00
2   Susan   2017/11/15  Penalty       5.00
3   Thomas  2016/10/01  Membership    8.00
Most SQL databases you’ve probably used in the past are row-based, which means they store their data something like this:
1:Harold,2018/01/01,Membership,10.00;
2:Susan,2017/11/15,Penalty,5.00;
3:Thomas,2016/10/01,Membership,8.00;
This is an efficient way to maximize storage, and it works well for retrieving data in the “traditional fashion” (rows at a time). But when you want to slice and dice this data, it doesn’t scale very well. If you’ve got large (business-scale) volumes of data, and a variety of ways you want to query it, you can really start to strain your database. Column-based databases, on the other hand, flip this idea on its head, and store the information in a column-based format, with the *data* serving as the *key*. So the above might look something like this:
Harold:1;Susan:2;Thomas:3;
2018/01/01:1;2017/11/15:2;2016/10/01:3;
Membership:1,3;Penalty:2;
10.00:1;5.00:2;8.00:3;
This drastically improves query performance. For example, when searching for DESCRIPTION == 'Membership', the query only needs to make one database call (“give me the items with a DESCRIPTION of 'Membership'”), instead of inspecting each row individually (as it would have to do in a traditional, row-based database). Very cool, very fast!

Massive Parallelization

When I picture what the AWS cloud must look like, I usually conjure something up from the Matrix (except it’s full of regular computers, rather than, well, humans). Or maybe Star Trek’s “Borg”, a ridiculous planet-cube flying through space, sucking up other civilizations. I guess both of those images are a little disturbing. A safer mental image is this – data centers spanning the globe, loaded with racks and racks of computers, all connected and working together. For most computing tasks, throwing more hardware at the problem doesn’t automatically increase performance. There are bottlenecks that remain in place no matter how many processors are churning away. In our “traditional database” example, this bottleneck is typically disk I/O – the processors are all trying to grab data from the same place. To overcome this, the architecture and storage have to be arranged in a way that can benefit from parallelization. Which is exactly the case with AWS Redshift. Due to the column-based design described above, Redshift is able to take full advantage of adding processors, and it’s almost linearly scalable. This means if you double the number of computers (“nodes”, in Redshift-speak), the performance doubles. And so on. Combine this scalability with the ridiculous number of computers AWS has at its disposal (specifically, several Borgs-worth), and it’s like staring out at a starry night. It goes on forever in all directions.

How this works for you

If you’re sold on the power of AWS Redshift, then you’ll be pleased to learn that setup is incredibly simple. AWS documentation is top notch, a crucial thing in this brave new world. When writing this post, I followed their tutorial, and it all went smoothly. It probably took me 15 minutes, and I had the example up and running. If you already have SQL expertise, you won’t have any problem picking up Redshift syntax. There are some differences and nuances, but the standard “things” (joins, where clauses, etc.) all work as expected. I typically use Microsoft’s SQL Server Management Studio (SSMS), and was able to connect to Redshift with no problem (after setting it up as a linked server). Your favorite SQL client will presumably work here as well (anything that supports JDBC or ODBC drivers).

Once you get your feet wet, there are myriad tools that will load your business data into Redshift. If you’ve got SQL chops in house, I’d start with the AWS documentation, and go from there. If you need a little (or a lot) of help, a whole ecosystem of companies and tools have sprung up around Redshift. A quick Google search will introduce you to them.

When you’re up and running, and growing more comfortable demanding more from the system, AWS makes it incredibly simple to add capacity. Thanks to the brilliant Redshift architecture, you just add nodes, and AWS takes care of the rest. Their billing dashboard will show you what it’s costing in real time, with no hidden or creeping costs of data centers, hardware upgrades, things going bump in the night, etc. So much magic happening under the covers, and you get the credit. The joys of cloud computing!
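For reference, here's a hedged sketch of the linked-server setup mentioned a moment ago. It assumes you've already created an ODBC DSN for your cluster; the DSN name ("RedshiftDSN"), the server alias ("redshift"), and the credentials are all placeholders:

# Register Redshift as a linked server over ODBC (run via sqlcmd, or paste
# the T-SQL into SSMS); MSDASQL is the OLE DB provider for ODBC sources
sqlcmd -S localhost -E -Q "
EXEC master.dbo.sp_addlinkedserver
    @server = N'redshift',
    @srvproduct = N'',
    @provider = N'MSDASQL',
    @datasrc = N'RedshiftDSN';
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'redshift',
    @useself = 'FALSE',
    @rmtuser = N'awsuser',
    @rmtpassword = N'your-password';
"

The @server alias chosen here is what the `at redshift` clause in the query below refers to.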

My Humble Example

When writing this, I used the example AWS provides (it consists of a few tables containing some fake Sales data). With everything in place, I can query from SSMS (with a little bit of “linked server” glue syntax):
exec ('-- Find total sales on a given calendar date.
SELECT sum(qtysold)
FROM sales, date
WHERE sales.dateid = date.dateid
AND caldate = ''2008-01-05'';') at redshift

sum
--------------------
210

(1 row affected)
I get a thrill when a chain of systems, architectures, and networks all flow together nicely. Somewhere in a behemoth of a data center, a processor heard my cry, and spun out this result in response. Amazing.

DreamFactory

Now that the company has access to the data, and can gleefully ask any question, they are going to want the dashboards and pretty graphs. Typically you’d use a REST API to feed the data to some sort of UI, but how to do this with Redshift? While management is tickled with their new toy, they will cloud over with suspicion if you now propose a months-long project to make it shinier. In keeping with the theme of “easy, automatic, and powerful”, I’d propose using DreamFactory. In a matter of minutes (literally), it will connect to a data store (both SQL and NoSQL), intelligently parse all the schema, and spin up a REST API layer for doing all the things (complete with attractive documentation). What used to take a team of developers months can now happen in an afternoon!

Here are some screenshots of my REST API, completely auto-generated from the Redshift example above. It took me about 15 minutes (12 of those spent poking around the documentation) to get this done. For my simple example, I followed their Docker instructions, and in no time was playing with the REST API depicted below:
let’s get our rest on!
what pretty documentation you have!
Powerful stuff!
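To give a feel for what those screenshots boil down to in raw HTTP terms, here's a hedged sketch of a call against the generated API. The host, the service name ("redshift"), and the API key are all placeholders for whatever your own instance assigns:

# Query the sales table through the auto-generated REST endpoint,
# filtering server-side and capping the result set
curl -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  "http://localhost/api/v2/redshift/_table/sales?filter=qtysold%3E2&limit=5"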

To Infinity and Beyond!

Now that you’ve witnessed how easily you can warehouse all your data, and bootstrap it into a REST API, it’s time to bring this to your organization. Play with it a little, get comfortable with the tools, then turn up the dials.   Want to learn more about how DreamFactory and Redshift can work together (or how to put a REST API on any database)?  Schedule a demo with us. The next time management comes calling for data, you can give it to them with a fire hose!

Microsoft Server 2012 R2, SQL Server 2016 and DreamFactory – A Match Made in Heaven

Part 1: Running Microsoft Server 2012 and SQL Server on AWS, on my MacBook Pro

How do we get from here, hosting an AWS Microsoft Server instance on my MacBook Pro?
AWS Microsoft Server 2012 R2 Desktop
To here, using Microsoft Server, SQL Server, and DreamFactory, still on my MacBook Pro:
SQL Server Get Schema

Some Background:

Let’s get to the nuts and bolts of this. In the past, it was very difficult to cross over platforms and create Microsoft-based solutions or Linux-based solutions on the other’s platform. With the advent of cloud computing, this has become increasingly easier to do. When you have a robust piece of middleware such as DreamFactory, which is for most intents and purposes language and platform agnostic, you really do have your choice of platforms to install it on. Each has its advantages and disadvantages, which I am not going to detail in this article, but suffice it to say there are a lot of enterprises that choose the Microsoft platform(s), and some of those advantages became apparent as I worked on this post. First things first, make sure to grab all of the prerequisites you need to make the install easy:

Required Software and Extensions

At a minimum, you will need the following software and extensions installed and enabled on your system in order to successfully clone and install DreamFactory 2.12.0+.
  • PHP 7+ – check and install the requirements below for your particular environment.
    • PHP required extensions: Curl, MBString, MongoDB, SQLite, and Zip. You may need to install other extensions depending upon DreamFactory usage requirements. If you don’t plan on using MongoDB, please remove the df-mongodb requirement from composer.json or include the --ignore-platform-reqs option when running composer install.
  • Git
  • Windows Git Client – Git Bash lets you run “Linux style” commands
  • A web server such as NGINX, Apache, or IIS. You may use PHP’s built-in server for development purposes.
  • One of four databases for storing configuration data: MS SQL Server, MySQL (MariaDB or Percona are also supported), PostgreSQL, or SQLite.
  • Composer – may require cURL to be installed for your particular environment.
Microsoft Server can be spun up almost anywhere now, as is evidenced by the photos above, and since DreamFactory is platform agnostic, we can install it on the Microsoft Server 2012 R2 instance with just a few bits of software installed to get up and running. There are multiple ways to grab and install PHP on a Microsoft platform, but an easy way is to utilize the Web Platform Installer (version 5.0 as of this post).

The Install:

You can download the Web Platform Installer for IIS here. Select a PHP version (7.0.x is required to run the current 2.13.0 version of DreamFactory), and different pieces of IIS, should you decide to utilize that as your production web server.  This post will not dive into the nitty-gritty of IIS, but you can see our documentation here.  We will be using PHP’s built-in development web server to just illustrate the connections.
Web Platform Installer 5.0, showing PHP installed
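Before moving on, it's worth a quick check from Git Bash or a command prompt that PHP landed correctly:

php -v   # confirm PHP 7+ is on the PATH
php -m   # the module list should include curl, mbstring, sqlite3, zip, etc.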
Once you have installed PHP and double checked your pre-requisites are installed, you can begin the install:
  • Perform a Git clone of DreamFactory into your chosen directory:
git clone https://github.com/dreamfactorysoftware/dreamfactory
Clone down the latest version
This will pull down the master branch of DreamFactory into a directory called ./dreamfactory.
  • Navigate to the dreamfactory directory and install dependencies using Composer. For a production environment, use --no-dev; for a development environment, discard that option. If you are not running MongoDB, and don’t plan to, add --ignore-platform-reqs:
composer update --ignore-platform-reqs --no-dev
Otherwise, run the following command to install the dependencies:
composer install --no-dev
  • Run the DreamFactory setup command-line wizard. This will set up your configuration and prompt you for things like database settings and the first admin user account. It also allows you to change environment settings midway and then run it again to complete the setup.
php artisan df:setup
Follow the on-screen prompts to complete the setup.
You can then run php artisan serve and navigate to the address and port you have set up. In this example, we are running off of http://127.0.0.1:8000:
php artisan serve

Part 2:  The SQL Server Reckoning

With our instance now running, we can finally delve into the “fun” part of this install. The ease with which you can add a SQL Server instance is awesome. It is the fastest install I have ever done: from driver install to DreamFactory connection, it took less than 5 minutes¹. Using our trusty Web Platform Installer friend, you can download a SQL Server driver package that is compatible with your PHP version and your O/S version.
SQL Server Driver Package, version 5.2
Now you can head back over to your instance and create a SQL Server service.  Just select the service type, add in your credentials and then test it.  That’s it.  No muss, no fuss.  Take a look at the screenshots below to see the results.
Create your service
Add your credentials
SQL Server Get Schema
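For the curious, the “get schema” call in that last screenshot boils down to a single request. The service name ("sqlserver") and API key here are placeholders; the address is the development server from Part 1:

# Retrieve the SQL Server schema through the generated API
curl -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  "http://127.0.0.1:8000/api/v2/sqlserver/_schema"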
We have now connected our SQL Server instance to our Microsoft Server 2012 R2 (both hosted on AWS) on my MacBook Pro.  Sometimes, it all falls into place.  Don’t forget to check out our wiki and community forums for more topics, information, and examples.
¹ I had my credentials on hand in a notepad text file for copy/paste quickness, but still, very fast 🙂

Filtering Related Columns within DreamFactory REST API Queries

Consider a query which joins employee records found in an employees table with information about their assigned department, the latter of which resides in a table named departments. The relationship is formalized using a key named emp_no. When DreamFactory parses the schema it will create aliases for each relationship, including one for the above-described relationship named something like dept_emp_by_emp_no. The join query will therefore look like this:
/api/v2/mysql/_table/employees?related=dept_emp_by_emp_no
This would yield a JSON response containing records that look like this:
{
  "emp_no": 10001,
  "birth_date": "1953-09-02",
  "first_name": "Georgi",
  "last_name": "Facello",
  "gender": "M",
  "hire_date": "1986-06-26",
  "birth_year": "1953",
  "dept_emp_by_emp_no": [
    {
      "emp_no": 10001,
      "dept_no": "d005",
      "from_date": "1986-06-26",
      "to_date": "9999-01-01"
    }
  ]
},
If you wanted to limit the related fields to just dept_no and from_date, you would add dept_emp_by_emp_no.fields to the parameter list:
/api/v2/mysql/_table/employees?related=dept_emp_by_emp_no&dept_emp_by_emp_no.fields=dept_no,from_date
This query would yield records with the following structure:
{
  "emp_no": 10001,
  "birth_date": "1953-09-02",
  "first_name": "Georgi",
  "last_name": "Facello",
  "gender": "M",
  "hire_date": "1986-06-26",
  "birth_year": "1953",
  "dept_emp_by_emp_no": [
    {
      "dept_no": "d005",
      "from_date": "1986-06-26"
    }
  ]
},
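To run that same query outside of the API Docs tab, you'd send it with the usual DreamFactory headers; the host and API key below are placeholders:

# Fetch employees with related department rows trimmed to two fields
curl -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  "http://localhost/api/v2/mysql/_table/employees?related=dept_emp_by_emp_no&dept_emp_by_emp_no.fields=dept_no,from_date"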
You can learn more about working with related data inside DreamFactory on our wiki: http://wiki.dreamfactory.com/DreamFactory/Features/Database/Related_Data#Getting_the_Related_Data.

SQL DB REST APIs in Minutes, not Months

Have you got SQL data that you need to access from your mobile, web, or IoT apps?

If so, DreamFactory provides an easy and secure way to add a REST API to any SQL database in minutes, and supports 18 popular databases, among them MS SQL Server, Oracle, MySQL, IBM DB2, Postgres, SAP SQL Anywhere, SAP HANA, MemSQL, and MongoDB! All you have to do is use the DreamFactory REST API backend to connect your database, then use it to auto-generate a REST API for your database – it’s that simple!

In this blog post we’ll show how to REST-enable any SQL database – free forever for the databases and other services covered by our open source software. Then we’ll show some simple examples of how to use the REST API to manage your SQL schema and data.
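As a taste of what's ahead, the calls look roughly like this. The service name ("db"), table name, host, and API key are placeholder assumptions for illustration:

# List the tables a database service exposes
curl -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  "http://localhost/api/v2/db/_table"

# Inspect a single table's schema
curl -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  "http://localhost/api/v2/db/_schema/contacts"

# Create a record (DreamFactory wraps record sets in a "resource" array)
curl -X POST \
  -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"resource":[{"first_name":"Jane","last_name":"Doe"}]}' \
  "http://localhost/api/v2/db/_table/contacts"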

Do you need to create a REST API for MS SQL Server, Oracle, MongoDB, or any other database? Using DreamFactory, you can be up and running in minutes rather than months! Request a demo with one of our engineers and we’ll be happy to show you how it’s done for your particular use case! If you’re a video kind of person, we have some screencasts available. If you haven’t already checked out our free open source software, you can download it here.

Continue reading “SQL DB REST APIs in Minutes, not Months”

Create a MySQL REST API in Minutes Using DreamFactory

Karl Hughes recently penned a blog post titled “The Bulk of Software Engineering in 2018 is Just Plumbing”. Notably he stated, “Just like plumbers, we are paid to know our tools and understand how they work together to make a usable piece of equipment, not to reinvent working technology…”. As programmers we should not be bothered with repeatedly writing code which is otherwise readily available, robust, and well-tested. Yet this problem remains persistent in the REST API space, despite the implementation process being by this point in time rote, repetitive, and prone to error and oversight. This oversight is costly for several reasons:
  • End users just *do not care* how the API was implemented, meaning there is no competitive advantage to be had by hand-crafting a new API for each project.
  • Error and oversight in the API implementation and deployment phase can come at a very steep price due to security lapses and performance issues.
  • Repeatedly building one-off APIs means they can’t be managed via a single platform or interface; unless the team decides to devote even more time and effort to building a custom management solution.
Fortunately, the DreamFactory platform can easily absolve your team of all of these hassles and much more by offering a centralized solution for API generation, documentation, and security. In this tutorial I’ll show you just how easy it is to build, secure, and deploy a REST API for your MySQL database.

Follow Along!

DreamFactory’s MySQL service connector is part of our open source version. You can download an installer or clone directly from GitHub via our downloads page.

Generating the MySQL REST API

DreamFactory can generate REST APIs for 18 databases, among them MySQL, Microsoft SQL Server, Oracle, PostgreSQL, and MongoDB. To do so, you’ll log in to the DreamFactory administration interface, navigate to Services, and then enter the service creation interface by clicking on the Create button located to the left of the screen. From there you’ll select the MySQL service type by navigating to Database > MySQL (see below screenshot).

Next you’ll be prompted to provide a name, label, and description (below screenshot). The latter two are used just for reference purposes within the administration interface, however the name value is particularly important because, as you’ll soon see, it will comprise part of the API URL.

Finally, click on the Config tab. Here you’ll be prompted to provide the database connection credentials (see below screenshot). This should really be nothing new; you’ll supply a host name, username, password, and database. Additionally, you can optionally specify other configuration characteristics such as driver options, the timezone, and caching preferences. For the purpose of this tutorial I’ll stick to the required fields and leave the optional features untouched.

With the credentials in place, just press the Save button at the bottom of the screen, and believe it or not the REST API has been generated!

Viewing the Swagger Documentation

Along with the API, DreamFactory will also auto-generate an extensive set of interactive Swagger documentation. You can access it by clicking on the API Docs tab located at the top of the administration interface, and then selecting the newly generated service by name. You’ll be presented with 44 endpoints useful for executing stored procedures, carrying out CRUD operations, querying views, and much more. For instance the following screenshot presents just a small subset of newly generated MySQL REST API endpoints!  

Creating a Role and API Key

All DreamFactory-generated APIs are automatically protected by (at minimum) an API key. You can optionally authenticate users using basic authentication, SSO, or Directory Services (LDAP and Active Directory). Furthermore, you can associate each API key and/or user with a *role* which determines exactly what services the user is allowed to access. Not only that, you can restrict interactions to a specific database table or set of tables, a specific endpoint(s), and even restrict which HTTP methods are allowed.

As an example, let’s create a new role which restricts the associated API key to interacting with a single table in a read-only fashion within the newly created MySQL API. To do so, navigate to the Roles tab, and click the Create button. You’ll be presented with the interface found in the below screenshot. In the screenshot you’ll see I’ve already assigned a name and description for the role, and made it active by selecting the Active checkbox.

Next, click the Access tab. This is where you’ll define what the role can do. In the below screenshot you’ll see I’ve limited the role to interacting with the MySQL service, and within that service the role can only interact with the _table/employees* endpoint via the GET method. We’re on lockdown baby!

Save the role by clicking the Save button. Now we’ll create a new API key and associate the key with this role. To do so, click on the Apps tab located at the top of the screen, and then click the Create button. Assign your new App a name and description, ensure it is set to Active, and then assign it the default role of MySQL just as I’ve done in the below screenshot. Regarding the App Location setting, presuming you plan on interacting with the API via a web or mobile application, or via another web service, then you’ll want to select “No storage required”.

Press the Save button and you’ll be returned to the Apps index screen where the new API key can be copied! Copy the key into a text file for later reference.

Configuring CORS

We have one final configuration step before being able to test the API from outside the DreamFactory administration interface. You’ll need to enable CORS (Cross-Origin Resource Sharing) for the new API. For purposes of demonstration, you can set the default CORS setting as I’ve done in the below screenshot, which will allow API-restricted traffic from all network addresses:  

Testing the REST API

With the API generated, API key and associated role created, and CORS configured, you’re ready to begin interacting with the API via a client! I like to use Insomnia for HTTP testing on macOS, however another popular solution is Postman. In the following screenshot I’m using Insomnia to contact the /api/v2/mysql/_table/employees endpoint using a GET request.

Recall that we’ve locked down this API key to only interact with the /api/v2/mysql/_table/employees/* endpoints using the GET method. So what happens if we try to POST to this table? A 401 (Unauthorized) status code is returned, as depicted in the following screenshot (the same two requests appear in curl form at the end of this post):

Where to From Here?

Believe it or not, we’ve only scratched the surface in terms of what DreamFactory can do for you. If you’d like to see our SQL Server, Oracle, or MongoDB connectors in action, or would like to watch how easy it is to convert a SOAP service to REST without writing any code, why not schedule a demo with our engineering team! Head over to https://www.dreamfactory.com/products and schedule a demo today!
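As promised above, here's what the two test requests look like from the command line; the host and API key are placeholders for your own instance's values:

# The GET permitted by our read-only role
curl -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  "http://localhost/api/v2/mysql/_table/employees?limit=2"

# The POST our role forbids -- DreamFactory responds with a 401
curl -X POST \
  -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"resource":[{"emp_no":99999,"first_name":"Test"}]}' \
  "http://localhost/api/v2/mysql/_table/employees"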

Connecting MySQL with JavaScript; DreamFactory as a BaaS

The DreamFactory REST API enables database connections from a wide variety of front end scenarios. This simple sample app demonstrates how DreamFactory can easily be used as a backend for a JavaScript application. It’s a simple address book, where contacts can be created, shown, updated, deleted, and grouped: basically, CRUD operations.

Continue reading “Connecting MySQL with JavaScript; DreamFactory as a BaaS”

Turn CSV files into REST APIs with DreamFactory’s Data Importer

DreamFactory 2.7 introduces a new system endpoint, api/v2/system/import, that allows you to import data files using a database service of your choice. Currently the feature supports CSV files; support for XML and JSON is on our roadmap for future releases. This new endpoint is a DreamFactory native endpoint and is part of the “system” service. It is available to use right out of the box, without the need to install additional drivers or extensions.
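As a rough sketch of what such a call might look like: note that the parameter names below (service, table_name) and the multipart form layout are my assumptions for illustration, not confirmed against the 2.7 release; the full post covers the documented call.

# Upload a CSV and target an existing database service (parameter names
# are assumptions -- see the full post for the documented request)
curl -X POST \
  -H "X-DreamFactory-API-Key: YOUR_API_KEY" \
  -F "file=@contacts.csv" \
  "http://localhost/api/v2/system/import?service=db&table_name=contacts"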

Continue reading “Turn CSV files into REST APIs with DreamFactory’s Data Importer”