Drupal.org - aggregated feeds in category Planet Drupal

Wim Leers: State of JSON API (July 2018)

Tue, 07/10/2018 - 06:01

Quite a few people in the Drupal community are looking forward to seeing the JSON API module ship with Drupal 8 core.

Because:

  • they want to use it on their projects
  • the Admin UI & JS Modernization Initiative needs it
  • they want to see Drupal 8 ship with a more capable RESTful HTTP API
  • it means Drupal will have a non-NIH (Not Invented Here) API, one that follows a widely used spec
  • it enables them to build progressively decoupled components

So where are things at?

Timeline

Let’s start with a high-level timeline:

  1. The plan (intent) to move the JSON API module into Drupal core was approved by Drupal’s product managers and a framework manager 4 months ago, on March 19, 2018!
  2. A core patch was posted on March 29 (issue #2843147). My colleague Gabe and I had already been working full time for a few months at that point to make the JSON API module more stable: several security releases, much test coverage and so on.
  3. Some reviews followed, but mostly the issue (#2843147) just sat there. Anybody was free to provide feedback. We encouraged people to review, test and criticize the JSON API contrib module. People did: another 1000 sites started using JSON API! Rather than commenting on the core issue, they filed issues against the JSON API contrib module!
  4. Since December 2017, Gabe and I had been working on it full time, with e0ipso joining whenever his day job/free time allowed. Thanks to the test coverage Gabe and I had been adding, bugs were being fixed much faster than new ones were reported — and more often than not we found (long-existing) bugs before they were reported.
  5. Then 1.5 weeks ago, on June 28, we released JSON API 1.22, the final JSON API 1.x release. That same day, we branched the 2.x version. More about that below.
  6. The next day, on June 29, an updated core patch was posted. All feedback had been addressed!
June 29

I wrote in my comment:

Time to get this going again. Since #55, here’s what happened:

  1. Latest release at #55: JSON API 1.14
  2. Latest release today: JSON API 1.22
  3. 69 commits: ($ git log --oneline --since "March 30 2018 14:21 CET" | wc -l)
  4. Comprehensive test coverage completed (#2953318: Comprehensive JSON API integration test coverage phase 4: collections, filtering and sorting + #2953321: Comprehensive JSON API integration test coverage phase 5: nested includes and sparse field sets + #2972808: Comprehensive JSON API integration test coverage phase 6: POST/PATCH/DELETE of relationships)
  5. Getting the test coverage to that point revealed some security vulnerabilities (1.16), and many before it (1.14, 1.10 …)
  6. Ported many of the core REST improvements in the past 1.5 years to JSON API (1.15)
  7. Many, many, many bugfixes, and much, much clean-up for future maintainability (1.16, 1.17, 1.18, 1.19, 1.20, 1.21, 1.22)

That’s a lot, isn’t it? :)

But there’s more! All of the above happened on the 8.x-1.x branch. As described in #2952293: Branch next major: version 2, requiring Drupal core >=8.5 (and mentioned in #61), we have many reasons to start an 8.x-2.x branch. (That branch was created months ago, but we kept the two branches identical for months.)
Why wait so long? Because we wanted all >6000 JSON API users to be able to gently migrate from JSON API 1.x (on Drupal <=8.5) to JSON API 2.x (on Drupal >=8.5). And what better way to do that than to write comprehensive test coverage, and to fix all the known problems that surfaced? That’s what we’ve been doing the past few months! This massively reduces the risk of adding JSON API to Drupal core.

We outlined a plan of must-have issues before going into Drupal core: #2931785: The path for JSON API to core — and they’re all DONE as of today! Dozens of bugs have been flushed out and fixed before they ever entered core. Important: in the past 6–8 weeks we’ve noticed a steep drop in the number of bug reports and support requests filed against the JSON API module!

After having been tasked with maturing core’s REST API, finding the less-than-great state it was in when Drupal 8 shipped, and experiencing how hard it is to improve it or even just fix bugs, this was a hard requirement for me. I hope it gives core committers the same feeling of relief as it gives me, to see that JSON API will on day one be in much better shape.

The other reason it’s in much better shape is that the JSON API module now has no API surface other than the HTTP API! No PHP API at all (its sole PHP API was dropped in the 2.x branch: #2982210: Move EntityToJsonApi service to JSON API Extras), only the HTTP API as specified by http://jsonapi.org/format/.

TL;DR: JSON API in contrib today is more stable, more reliable and more feature-rich than core’s REST API. And it achieves that while strongly complying with the JSON API spec: it’s far less of a Drupalism than core’s REST API.
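For readers who haven’t worked with the spec: a JSON:API document wraps every resource in a small, predictable envelope, which is what makes it less of a Drupalism. A hedged sketch in JavaScript; the type, id and attribute names below are illustrative placeholders, not Drupal’s exact output:

```javascript
// A minimal JSON:API-style document, per jsonapi.org/format/.
// Names and ids here are made up for illustration.
const doc = {
  data: {
    type: "node--article",
    id: "11111111-2222-3333-4444-555555555555",
    attributes: { title: "Hello world" },
    relationships: {
      uid: { data: { type: "user--user", id: "..." } },
    },
  },
  links: { self: "https://example.com/jsonapi/node/article/..." },
};

// Every resource object carries `type` and `id`; fields live under
// `attributes`, references under `relationships`.
console.log(doc.data.type); // → node--article
console.log("attributes" in doc.data); // → true
```

Clients can rely on this envelope being the same for every resource type, which is the main practical difference from an ad-hoc REST payload.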

So, with pride, and with lots of sweat (no blood and no tears fortunately), @gabesullice, @e0ipso and I present you this massively improved core patch!

EDIT: P.S.: 668K of the 1.0M bytes that this patch contains are test coverage. That’s two thirds!

To which e0ipso replied:

So, with pride, and with lots of sweat (no blood and no tears fortunately), @gabesullice, @e0ipso and I present you this massively improved core patch! So much pride! This was a long journey, that I walked (almost) alone for a couple of years. Then @Wim Leers and @gabesullice joined and carried this to the finish line. Such a beautiful collaboration!

:)

July 9

Then, about 12 hours ago, core release manager xjm and core framework manager effulgentsia posted a comment:

(@effulgentsia and @xjm co-authored this comment.)

It’s really awesome to see the progress here on JSON API! @xjm and @effulgentsia discussed this with other core committers (@webchick, @Dries, @larowlan, @catch) and with the JSON API module maintainers. Based on what we learned in these discussions, we’ve decided to target this issue for an early feature in 8.7 rather than 8.6. Therefore, we will set it to 8.7 in a few days when we branch 8.7. Reviews and comments are still welcome in the meantime, whether in this issue or as individual issues in the jsonapi issue queue.

Feel free to stop reading this comment here, or continue reading if you want to know why it’s being bumped to 8.7.

First, we want to give a huge applause for everything that everyone working on the jsonapi contrib module has done. In the last 3-4 months alone (since 8.5.0 was released and #44 was written):
  • Over 100 issues in the contrib project have been closed.
  • There are currently only 36 open issues, only 7 of which are bug reports.
  • Per #62, the remaining bug fixes require breaking backwards compatibility for users of the 1.x module, so a final 1.x release has been released, and new features and BC-breaking bug fixes are now happening in the 2.x branch.
  • Also per #62, an amazing amount of test coverage has been written and correspondingly there’s been a drop in new bug reports and support requests getting filed.
  • The module is now extremely well-documented, both in the API documentation and in the Drupal.org handbook.
Given all of the above, why not commit #70 to core now, prior to 8.6 alpha? Well,
  1. We generally prefer to commit significant new core features early in the release cycle for the minor, rather than toward the end. This means that this month and the next couple are the best time to commit 8.7.x features.
  2. To minimize the disruption to contrib, API consumers, and sites of moving a stable module from core to contrib, we’d like to have it as a stable module in 8.7.0, rather than an experimental module in 8.6.0.
  3. Per above, we’re not yet done breaking BC. The mentioned spec compliance issues still need more work.
  4. While we’re still potentially evolving the API, it’s helpful to continue having the module in contrib for faster iteration and feedback.
  5. Since the 2.x branch of JSON API was just branched, there are virtually no sites using it yet (only 23 as compared with the 6000 using 1.x). An alpha release of JSON API 2.x once we’re ready will give us some quick real-world testing of the final API that we’re targeting for core.
  6. As @lauriii pointed out, an additional advantage of allowing a bit more time for API changes is that it allows more time for the Javascript Modernization Initiative, which depends on JSON API, to help validate that JSON API includes everything we need to have a fully decoupled admin frontend within Drupal core itself. (We wouldn’t block the module addition on the other initiative, but it’s an added bonus given the other reasons to target 8.7.)
  7. While the module has reached maturity in contrib, we still need the final reviews and signoffs for the core patch. Given the quality of the contrib module this should go well, but it is a 1 MB patch (with 668K of tests, but that still means 300K+ of code to review.) :) We want to give our review of this code the attention it deserves.
None of the above, aside from the last point, are hard blockers to adding an experimental module to core. Users who prefer the stability of the 1.x module could continue to use it from contrib, thereby overriding the one in core. However, in the case of jsonapi, I think there’s something odd about telling site builders to experiment with the one in core, but, if they want to use it in production, to downgrade to the one in contrib. I think that people who are actually interested in using jsonapi on their sites would be better off going to the contrib project page and making an explicit 1.x or 2.x decision from there. Meanwhile, we’ll see what issues, if any, people run into when upgrading from 1.x to 2.x. When we’re ready to commit it to core, we’ll consider it at least beta stability (rather than alpha). Once again, really fantastic work here.

Next

So there you have it. JSON API will not be shipping in Drupal 8.6 this fall.
The primary reason is that significant new core features should land early in the release cycle, especially ones shipping as stable from the start. This also gives the Admin UI & JS Modernization Initiative more time to actually exercise many parts of JSON API’s capabilities, and in doing so validate that it’s sufficiently capable to power it.

For us as JSON API module maintainers, it keeps things easier for a little while longer: once it’s in core, it’ll be harder to iterate — more process, slower test runs, and commits that can only be made by core committers, not by JSON API maintainers. Ideally, we’d commit JSON API to Drupal core with zero remaining bugs and tasks, with only feature requests left. Good news: we’re almost there already — most open issues are feature requests!

For you as JSON API users, not much changes. Just keep using https://www.drupal.org/project/jsonapi. The 2.x branch introduced some breaking changes to better comply with the JSON API spec, and also received a few small new features. But we worked hard to make sure the disruption is minimal (example 1 2 3).1
Use it, try to break it, report bugs. I’m confident you’ll have to try hard to find bugs … and yes, that’s a challenge to y’all!

  1. If you want to stay on 1.x, you can — and it’s rock solid thanks to the test coverage we added. That’s the reason we waited so long to work on the 2.x branch: because we wanted the thousands of JSON API sites to be in the best state possible, not be left behind. Additionally, the comprehensive test coverage we added in 1.x guarantees we’re aware of even subtle BC breaks in 2.x! ↩︎

Hook 42: Drupal 8 Interviews: Spotlight on Amber Matz

Mon, 07/09/2018 - 23:42

If you have been to any of the North American Drupal Camps, chances are you recognize Amber Matz. She is the production manager and a trainer at Drupalize.Me, a site for Drupal tutorials and trainings. Amber, along with Joe Shindelar, travels to the various Drupal Camps and DrupalCons and currently runs Drupal 8 Theming workshops.

Drupal blog: Why large organizations are choosing to contribute to Drupal

Mon, 07/09/2018 - 13:59

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

During my DrupalCon Nashville keynote, I shared a brief video of Mike Lamb, the Senior Director of Architecture, Engineering & Development at Pfizer. Today, I wanted to share an extended version of my interview with Mike, where he explains why the development team at Pfizer has ingrained Open Source contribution into the way they work.

Mike had some really interesting and important things to share, including:

  1. Why Pfizer has chosen to standardize all of its sites on Drupal (from 0:00 to 03:19). Proprietary software isn't a match.
  2. Why Pfizer only works with agencies and vendors that contribute back to Drupal (from 03:19 to 06:25). Yes, you read that correctly; Pfizer requires that its agency partners contribute to Open Source!
  3. Why Pfizer doesn't fork Drupal modules (from 06:25 to 07:27). It's all about security.
  4. Why Pfizer decided to contribute to the Drupal 8's Workflow Initiative, and what they have learned from working with the Drupal community (from 07:27 to 10:06).
  5. How to convince a large organization (like Pfizer) to contribute back to Drupal (from 10:06 to 12:07).

Between Pfizer's direct contributions to Drupal (e.g. the Drupal 8 Workflow Initiative) and the mandate for its agency partners to contribute code back to Drupal, Pfizer's impact on the Drupal community is invaluable. It's measured in the millions of dollars per year. Just imagine what would happen to Drupal if ten other large organizations adopted Pfizer's contribution model!

Most organizations use Open Source, and don't think twice about it. However, we're starting to see more and more organizations not just use Open Source, but actively contribute to it. Open source offers organizations a completely different way of working, and fosters an innovation model that is not possible with proprietary solutions. Pfizer is a leading example of how organizations are starting to challenge the prevailing model and benefit from contributing to Open Source. Thanks for changing the status quo, Mike!

Evolving Web: Decoupling Drupal with Gatsby

Mon, 07/09/2018 - 09:39

Gatsby is a really fast React-based static site generator. You can use it to create a static site, with content pulled from Drupal and other content management systems. 

Why Use Gatsby?

Unlike dynamic sites which render pages on-demand, static site generators pre-generate all the pages of the website. This means no more live database querying and no more running through a template engine. Performance goes up and the maintenance cost goes down.
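The idea of pre-generation can be sketched in a few lines of JavaScript. This is a toy illustration, not Gatsby's actual pipeline: every page is rendered to an HTML string once, at build time, instead of per request.

```javascript
// Toy static-site generation: render every page once, up front.
const pages = [
  { slug: "about", title: "About" },
  { slug: "contact", title: "Contact" },
];

const renderPage = (page) =>
  `<html><body><h1>${page.title}</h1></body></html>`;

// "Build": produce a map of path → prerendered HTML. Serving the site is
// then just handing out these strings — no database, no template engine.
const site = Object.fromEntries(
  pages.map((p) => [`/${p.slug}.html`, renderPage(p)])
);

console.log(Object.keys(site)); // → [ '/about.html', '/contact.html' ]
```

A web server (or CDN) only has to serve the resulting files, which is where the performance and maintenance wins come from.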

Static site generators have been evolving over the last few years. Tools like Jekyll, Gatsby, Hexo and Hugo have become more and more popular. They have been chosen by developers who want a simple website/blog solution: they need very minimal server setup and have a low maintenance cost. However, static site generators usually require writing content in Markdown, which is not a great authoring experience for most content editors.

On the other hand, content management systems such as Drupal and WordPress can provide a very powerful back-end. Having a WYSIWYG editor and content types helps editors manage content more easily and systematically. However, maintaining a CMS requires hosting a web server and database, and opens you up to security vulnerabilities and performance issues.

Gatsby sits between the simplicity and robustness of a static site and the versatile back-end of a content management system. Using Gatsby means that you can host the CMS in-house and publish content generated by Gatsby as a static website. The first thing you’ll notice about Gatsby is how amazingly fast it is.
 

How to Integrate Drupal and Gatsby

In this tutorial, we are going to put together a demo that pulls Drupal content into a Gatsby site. We’ll borrow content from this awesome blog post to create a list of coffee types in Drupal, then transfer that content to Gatsby.

This goal can be achieved with 4 steps:

  1. Build a Drupal server
  2. Build a Gatsby site
  3. Fetch content from the Drupal server
  4. Publish the Gatsby site
1. Build a Drupal server

Let’s say we already have a Drupal 8 site installed. We’ll need to:

  • Create a content type named Coffee with three fields: Title, Body and Image
  • Turn Drupal into an API server by installing two modules: jsonapi and jsonapi_extras
  • Give the Anonymous user permission to Access the JSON API resource list
  • Verify that the API server is working by visiting http://[your-site]/jsonapi as an Anonymous user. The page should show all the information about your API server

Tips

  • If you use Chrome, use the JSON Viewer Chrome extension to view JSON data in a better format
  • If you don’t give the Anonymous user permission to Access the JSON API resource list, you’ll get error 406 - Not Acceptable when trying to connect to Drupal from Gatsby
  • If you don’t have jsonapi_extras installed, you’ll get error 405 - Method Not Allowed when querying data from Gatsby
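As a hedged sketch, the two error codes from the tips above can be mapped to their fixes in code. The helper below and its messages are ours, purely illustrative — not part of Gatsby or Drupal:

```javascript
// Illustrative helper: translate the two HTTP errors mentioned in the
// tips above into actionable hints. The messages are ours, not Gatsby's.
function explainJsonApiError(status) {
  switch (status) {
    case 406: // Not Acceptable
      return "Grant Anonymous the 'Access JSON API resource list' permission.";
    case 405: // Method Not Allowed
      return "Enable the jsonapi_extras module on the Drupal site.";
    default:
      return `Unexpected HTTP ${status}`;
  }
}

console.log(explainJsonApiError(406));
```

Keeping the two failure modes straight saves a lot of head-scratching when the Gatsby build first talks to Drupal.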
2. Build a Gatsby Site

First, make sure you have node and npm installed on your computer. Verify this by typing node -v and npm -v into the terminal:

$ node -v
v10.1.0
$ npm -v
5.6.0

Install Gatsby’s command line tool

npm install --global gatsby-cli

Create a new Gatsby site and run it. I’ll call my Gatsby site coffees.gatsby:

gatsby new coffees.gatsby
cd coffees.gatsby
gatsby develop  # start the hot-reloading development environment

By default, the new Gatsby site is accessible at localhost:8000

3. Fetch Content from the Drupal Server

At this step, we’ll be creating a new simple page /coffees that displays all the coffee types from the Drupal site.

Create the /coffees page

Creating a new page in Gatsby is as simple as creating a new JS file. All Gatsby pages should be stored in /src/pages. In this case, we’ll create the file coffees.js in /src/pages and add the following code to it:

import React from "react"

const CoffeesPage = () => (
  <h1>Different types of coffee</h1>
)

export default CoffeesPage

This simple code does one thing: it creates a new page at /coffees. The content of that page is an h1 heading with the text “Different types of coffee”.

Query Drupal content using GraphQL

In order to pull data from the Drupal 8 site, we’ll need to install the gatsby-source-drupal plugin:

# in your coffees.gatsby folder
npm install --save gatsby-source-drupal

Configure the gatsby-source-drupal plugin

// In gatsby-config.js
plugins: [
  ...
  {
    resolve: 'gatsby-source-drupal',
    options: {
      baseUrl: 'http://dcmtl2018-demo.server/',
      apiBase: 'jsonapi', // endpoint of the Drupal server
    },
  },
],

After adding the plugin configuration, the site should still be functioning. If Gatsby throws a 406 error, check the permissions on the Drupal site; if Gatsby throws a 405 error, make sure the jsonapi_extras module is enabled.

Build a GraphQL query for all coffee nodes from Drupal

Gatsby comes with an in-browser tool for writing, validating and testing GraphQL queries, named GraphiQL. It can be found at localhost:[port]/___graphql; in our case that’s localhost:8000/___graphql.

Let’s try querying all the Coffee nodes in this tutorial

After building the query successfully, let’s go back to the coffees.js file to execute the query.

export const query = graphql`
  query allNodeCoffee {
    allNodeCoffee {
      edges {
        node {
          id
          title
          body {
            value
            format
            processed
            summary
          }
        }
      }
    }
  }
`

Then, update the const CoffeesPage to display the title and body content:

const CoffeesPage = ({ data }) => (
  <div>
    <h1>Different types of coffee</h1>
    {data.allNodeCoffee.edges.map(({ node }) => (
      <h3 key={node.id}>{node.title}</h3>
    ))}
  </div>
)

Thanks to hot-reloading, we can see the sweet fruit of our work right after saving the file.

So far, we have:

  • Created an API server with Drupal, jsonapi and jsonapi_extras
  • Created a Gatsby site with a coffees.js page that “reads” content from the Drupal server

Let’s move on to the last step of the tutorial: publishing the Gatsby site.

4. Publish the Gatsby Site

Gatsby calls itself a static site generator, meaning its main purpose is to generate a bunch of static HTML, JS, CSS and image files. This can be done with a single command:

gatsby build

Once it finishes, check out the /public folder to see the result of your hard work throughout this long tutorial. Deploying your site is now as simple as copying or pushing the contents of /public to a server.

 

Conclusion

In this tutorial, we learned how to:

  • Create a new Gatsby site
  • Install a new plugin in Gatsby
  • Use GraphiQL to write, validate and test GraphQL queries

I personally find that Gatsby is a good solution for setting up a simple blog. It’s easy to install, very fast, and requires zero server maintenance. In a future blog post, I’ll talk about how to integrate more complex data from Drupal into Gatsby.


DrupalEasy: DrupalEasy Podcast 211 - Tara King - Diversity and Inclusion Contribution Team

Mon, 07/09/2018 - 08:45

Direct .mp3 file download.

Tara King, Customer Success Engineer at Pantheon, member of the Drupal Diversity and Inclusion leadership team, and a member of the Core Mentoring Leadership team, joins Mike Anello to talk about Drupal's Diversity and Inclusion Contribution Team - and how you can get involved.


Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.

Agiledrop.com Blog: AGILEDROP: Drupal Security Tips

Mon, 07/09/2018 - 06:55
Drupal is the leading enterprise web content management framework. With this popularity, of course, comes increased security risk. While Drupal out of the box is widely considered to be very secure, there are additional steps one must take to keep a Drupal site as attack-proof as possible.

Minimum Administrative Privileges: In Drupal, users with administrative privileges have access to each and every section of the site. This, of course, means that administrative privileges in the wrong hands could prove to be the end of…

ThinkShout: Automatic Page Generation with Custom Entity Routes

Mon, 07/09/2018 - 05:02

One of the most useful items in the Drupal 8 toolbox is the Paragraphs Module. By creating custom paragraph types, you can have much finer control over the admin and content creation process in Drupal.

A recent client of ThinkShout needed a content type (office locations) to include ‘sub-pages’ for things like office hours, services, and other items depending on the location. Most of the sub-page content was pretty simple, but they also needed to have direct links, be printable, and have the same header as the parent page. This ruled out an Ajax solution.

We’ve been using Paragraphs to make configurable content throughout the site, and since the sub-pages only have Title and Content fields, we thought they would be a good fit here as well. We then decided to explore the possibility of using custom entity routes to fulfill the other requirements.

To start, we created two additional view modes for the sub-page paragraphs called Sub-page and Menu link containing the Content and Title fields respectively. By keeping these fields in separate view modes, we make it much easier to work with them.

Next we created a custom module to hold all of our code, ts_sub_pages. In addition to the standard module files, we added the file ts_sub_pages.routing.yml, which contains the following:

ts_sub_pages.sub_pages:
  path: '/node/{node}/sub-page/{paragraph}'
  defaults:
    _controller: '\Drupal\ts_sub_pages\Controller\TSSubPagesController::subPageParagraph'
    _title_callback: '\Drupal\ts_sub_pages\Controller\TSSubPagesController::getTitle'
  options:
    parameters:
      node:
        type: entity:node
      paragraph:
        type: entity:paragraph
  requirements:
    _permission: 'access content'

This defines a unique system path based on the parent node ID and the paragraph entity ID. It would look like https://example.org/node/12345/sub-page/321. It also defines the call to the controller and the title_callback, essentially a location where we can create functions to manipulate the entity and its route. The options define the things to pass into the controller and title callback functions, and we also define access permissions using requirements.

One of the odd things about the controller and title_callback calls is that they look like a path, but are not. They have a predefined (and minimally documented) structure. You must do the following to make them work:

  • Create two folders in your module: src/Controller (case is important).
  • Create a file called TSSubPagesController.php - this must match the call.
  • Define a class matching TSSubPagesController in TSSubPagesController.php
  • Define a function matching subPageParagraph inside the TSSubPagesController class.

Example below. The names of the controller file, class, and function are up to you, but they must have the same case, and the file and class must match.

Digging into the TSSubPagesController.php file, we have a setup like so:

<?php

namespace Drupal\ts_sub_pages\Controller;

use Drupal\Core\Controller\ControllerBase;
use Symfony\Component\HttpFoundation\Request;
use Drupal\node\Entity\Node;
use Drupal\paragraphs\Entity\Paragraph;

/**
 * TS Sub Pages controller.
 */
class TSSubPagesController extends ControllerBase {

  /**
   * {@inheritdoc}
   */
  public function subPageParagraph(Paragraph $paragraph, Node $node, Request $request) {

Here we have the namespace - this is our module. Note again that the src is taken for granted. Next are the Symfony/Drupal use statements, to pull in the classes/interfaces/traits we’ll need. Then we extend the ControllerBase class with TSSubPagesController, and define our subPageParagraph function. The function pulls in the $node and $paragraph options we defined in ts_sub_pages.routing.yml.

Now we can finally get to work on our sub-pages! Our goal here is to bring in the parent node header fields on every sub-page path. In the Drupal admin interface, go to ‘Manage Display’ for your content type. In our case it was /admin/structure/types/manage/location/display. Scroll to the bottom and under ‘Custom display settings’ you’ll find a link to ‘Manage view modes’. We added a mode called sub-page, and added all of the fields from our Location’s header.

Now we can bring that view of the node into the sub-page using the subPageParagraph function we defined above:

<?php

public function subPageParagraph(Paragraph $paragraph, Node $node, Request $request) {
  $node_view_builder = \Drupal::entityTypeManager()->getViewBuilder('node');
  $node_header = $node_view_builder->view($node, 'sub_page');

  $paragraph_view_builder = \Drupal::entityTypeManager()->getViewBuilder('paragraph');
  $paragraph_body = $paragraph_view_builder->view($paragraph, 'sub_page');

  return ['node' => $node_header, 'paragraph' => $paragraph_body];
}

We get the node and paragraphs using getViewBuilder, then the view modes for each. The node’s ‘sub-page’ view mode contains all of the header fields for the node, and the paragraph ‘sub-page’ view mode contains the paragraph body. We return these, and the result is what looks like a page when we visit the base paragraph url of /node/12345/sub-page/321. The title is missing though, so we can add that with another small function inside the TSSubPagesController class (we call it using the _title_callback in ts_sub_pages.routing.yml):

<?php

/**
 * Returns a page title.
 */
public function getTitle(Paragraph $paragraph, Node $node) {
  $node_title = $node->getTitle();
  $paragraph_title = $paragraph->field_title_text->value;
  return $node_title . ' - ' . $paragraph_title;
}

Now we need to build a menu for our sub-pages. For this we can just use the ‘sub-pages’ paragraph field on the parent node. In the admin display, this field is how we add the sub-page paragraphs, but in the public-facing display, we use it to build the menu.

First, make sure you include it in the ‘default’ and ‘sub-page’ displays as a Rendered Entity, using the “Rendered as Entity” formatter, whose widget configuration lets you select the “Menu Link” view mode. When we set up the Paragraph, we put the Title field in the ‘Menu Link’ view. Now the field will display the titles of all the node’s sub-pages. To make them functional links, go to the ‘Menu Link’ view mode for your sub-page paragraph type, make the Title a ‘Linked Field’, and use the following widget configuration:

Destination: /node/[paragraph:parent_id]/sub-page/[paragraph:id]
Title: [paragraph:field_title_text]
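For each sub-page paragraph, the Destination pattern above expands to the system path defined in the routing file. Roughly, in JavaScript (a hypothetical helper, for illustration only):

```javascript
// Hypothetical illustration of how the token pattern
// /node/[paragraph:parent_id]/sub-page/[paragraph:id] expands.
function subPageDestination(parentId, paragraphId) {
  return `/node/${parentId}/sub-page/${paragraphId}`;
}

console.log(subPageDestination(12345, 321)); // → /node/12345/sub-page/321
```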

Next we need to account for the fact that the site uses URL aliases. A node called ‘main office’ will get a link such as /locations/main-office via the Pathauto module. We want our sub-pages to use that path.

We do this by adding a URL Alias to the sub-page routes on creation (insert) or edit (update). In our module, we add the following functions to the ts_sub_pages.module:

<?php

use Drupal\Core\Entity\EntityInterface;

/**
 * Implements hook_entity_insert().
 */
function ts_sub_pages_entity_insert(EntityInterface $entity) {
  if ($entity->getEntityTypeId() == 'paragraph' && $entity->getType() == "custom_subpage") {
    _ts_sub_pages_path_alias($entity);
  }
}

/**
 * Implements hook_entity_update().
 */
function ts_sub_pages_entity_update(EntityInterface $entity) {
  if ($entity->getEntityTypeId() == 'paragraph' && $entity->getType() == "custom_subpage") {
    _ts_sub_pages_path_alias($entity);
  }
}

These get called every time an entity is added or updated, and in turn call a custom function (defined below) whenever the entity is a custom_subpage paragraph. It’s important to note that we have a custom title field, field_title_text, defined - your title may be the Drupal default:

<?php

use Drupal\Component\Utility\Html;

/**
 * Custom function to create a sub-path alias.
 */
function _ts_sub_pages_path_alias($entity) {
  $sub_page_slug = Html::cleanCssIdentifier(strtolower($entity->field_title_text->value));
  $node = \Drupal::routeMatch()->getParameter('node');
  $language = \Drupal::languageManager()->getCurrentLanguage()->getId();
  $nid = $node->id();
  $alias = \Drupal::service('path.alias_manager')->getAliasByPath('/node/' . $nid);
  $system_path = "/node/" . $nid . "/sub-page/" . $entity->id();

  if (!\Drupal::service('path.alias_storage')->aliasExists($alias . "/" . $sub_page_slug, $language)) {
    \Drupal::service('path.alias_storage')
      ->save($system_path, $alias . "/" . $sub_page_slug, $language);
  }
}

This function gets the sub-page paragraph title, and creates a URL-friendly slug. It then loads the paragraph’s node, gets the current language, ID, and alias. We also build the system path of the sub-page, as that’s necessary for the url_alias table in the Drupal database. Finally, we check that there’s no existing path that matches ours, and add it. This will leave old URL aliases, so if someone had bookmarked a sub-page and the name changes, it will still go to the correct sub-page.
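The slug step can be approximated in a few lines of JavaScript. This is a rough stand-in for Html::cleanCssIdentifier(strtolower(...)), whose exact replacement rules differ slightly; it's illustrative only:

```javascript
// Rough approximation of the slug produced from a sub-page title:
// lowercase, then collapse characters outside [a-z0-9_-] into hyphens.
// Not the exact Drupal cleanCssIdentifier() algorithm.
function subPageSlug(title) {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9_-]+/g, "-");
}

console.log(subPageSlug("Office Hours")); // → office-hours
```

So a sub-page titled "Office Hours" under /locations/main-office would get the alias /locations/main-office/office-hours.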

Now we can add the ‘Home’ link and indicate when a sub-page is active. For that we’ll use a custom Twig template. The default field.html.twig file is the starting point; it’s located in core/themes/classy/templates/field/. Copy and rename it to your theme’s template directory. Based on the field name, it can be called field--field-sub-pages.html.twig.

The part of the twig file we’re interested in is here:

{% for item in items %}
  <div{{ item.attributes.addClass('field__item') }}>{{ item.content }}</div>
{% endfor %}

This occurs three times in the template, to account for multiple fields, labels, etc. Just before each of the for loops, we add the following ‘home’ link code:

{% if url('<current>')['#markup'] ends with node_path %}
  <div class="field__item active" tabindex="0">Home</div>
{% else %}
  <div class="field__item"><a href="{{ node_path }}">Home</a></div>
{% endif %}

Next, we make some substantial changes to the loop:

{% set sub_text = item.content['#paragraph'].field_title_text.0.value %}
{% set sub_path = node_path ~ '/' ~ sub_text|clean_class %}
{% if url('<current>')['#markup'] ends with sub_path %}
  <li{{ item.attributes.addClass('field__item', 'menu-item', 'active') }}>{{ sub_text }}</li>
{% else %}
  <li{{ item.attributes.addClass('field__item', 'menu-item') }}><a href="{{ sub_path }}">{{ sub_text }}</a></li>
{% endif %}

Here, sub_text gets the sub-page title and sub_path the path of each sub-page. We then check whether the current URL ends with that path; if so, we add the active class and remove the link.
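The branching above can be sketched in plain JavaScript. renderMenuItem is a hypothetical helper, just to show the decision the template makes for each item:

```javascript
// Hypothetical sketch of the template's branching: an item whose path matches
// the end of the current URL is rendered as active text, all others as links.
function renderMenuItem(currentUrl, subPath, label) {
  if (currentUrl.endsWith(subPath)) {
    return `<li class="field__item menu-item active">${label}</li>`;
  }
  return `<li class="field__item menu-item"><a href="${subPath}">${label}</a></li>`;
}

renderMenuItem('/about-us/our-team', '/about-us/our-team', 'Our Team');
// → '<li class="field__item menu-item active">Our Team</li>'
```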

And that’s it! The client can now add as many custom sub-pages as they like. They’ll always pick up the parent node’s base path so they will be printable, direct links. They’ll have the same header as the parent node, and they will automatically be added or deleted from the node’s custom context-aware menu.

Hmm, maybe this would make a good contributed module?

Drupixels: Enable debug mode and error reporting for local development in Drupal 8

Sun, 07/08/2018 - 10:57
If you are just starting with Drupal 8, one of the most important things you should know is how to enable debug mode and error reporting. This is really important for backend as well as frontend developers: you want the full error on screen while you work, because a generic error message alone might not tell you what the exact issue with the site is.

Matt Glaman: Filtering out invalid entity references in Drupal 8

Fri, 07/06/2018 - 16:15
Today I was working on a custom Drupal 8 form where I needed an option to purge existing entities and their references on a parent entity before running an import. It seemed pretty straightforward until I saw "ghost" values persisting on the parent entity's inline entity form. Here's my journey down the rabbit hole to fix broken entity reference values.

PreviousNext: Automate your Drupal accessibility testing with aXe and NightwatchJS

Thu, 07/05/2018 - 19:52

Automated accessibility tools are only one part of ensuring a website is accessible, but they are a very simple part that can catch a lot of really easy-to-fix issues. Issues that are found and corrected early in the development cycle don't get compounded into much larger issues down the track.

by Rikki Bochow / 6 July 2018

I’m sure we all agree that the accessibility of ALL websites is important. Testing for accessibility (a11y) shouldn’t be limited to Government services. It shouldn’t be something we need to convince non-government clients to set aside extra budget for. It certainly shouldn’t be left as a pre-launch checklist item that only gets the proper attention if the allocated budget and timeframe hasn’t been swallowed up by some other feature.

Testing each new component or feature against an a11y checker as it's being developed takes a small amount of time, especially when compared to the budget required to check and correct an entire website before launch for the very first time. Remembering to run such tests after a component's initial development is one thing. Remembering to re-check later down the line, when a few changes and possible regressions have gone through, is another. Our brains can only do so much, so why not let the nice, clever computer help out?

NightwatchJS

NightwatchJS is going to be included in Drupal 8.6.x, with some great Drupal-specific commands to make functional JavaScript testing in Drupal super easy. It's early days, so the documentation is still being formed. But we don't have to wait for 8.6.x to start using Nightwatch, especially when we can test interactions against our living Styleguide rather than booting up Drupal.

So let's add it to our build tools:

$ npm install nightwatch

and create a basic nightwatch.json file:

{
  "src_folders": [
    "app/themes/my_theme/src/",
    "app/modules/custom/"
  ],
  "output_folder": "build/logs/nightwatch",
  "test_settings": {
    "default": {
      "filter": "**/tests/*.js",
      "launch_url": "http://127.0.0.1",
      "selenium_host": "127.0.0.1",
      "selenium_port": "4444",
      "screenshots": {
        "enabled": true,
        "on_failure": true,
        "on_error": true,
        "path": "build/logs/nightwatch"
      },
      "desiredCapabilities": {
        "browserName": "chrome"
      }
    }
  }
}

We're pointing to our theme and custom modules as the source of our JS tests as we like to keep the tests close to the original JS. Our test settings are largely based on the Docker setup described below, with the addition of the 'filter' setting which searches the source for .js files inside a tests directory.
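As a rough illustration (not Nightwatch's actual implementation), the effect of that 'filter' glob is that only .js files sitting inside a tests directory are picked up:

```javascript
// Hypothetical matcher approximating the "**/tests/*.js" filter glob:
// match .js files whose immediate parent directory is "tests".
function matchesFilter(path) {
  return /(^|\/)tests\/[^/]+\.js$/.test(path);
}

matchesFilter('app/themes/my_theme/src/components/tests/tableTest.js'); // true
matchesFilter('app/themes/my_theme/src/components/table.js');           // false
```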

A test could be as simple as checking for an attribute, like the following example:

/**
 * @file responsiveTableTest.js.
 */
module.exports = {
  'Responsive tables setup': (browser) => {
    browser
      .url(`${browser.launch_url}/styleguide/item-6-10.html?hideAll`)
      .pause(1000);
    browser.expect.element('td').to.have.attribute('data-label');
    browser.end();
  },
};

This launches the Styleguide's table component, waits a beat for the JS to initialise, then checks that our td elements have the data-label attribute our JS added. Or it could be much more complex.

aXe: the Accessibility engine

aXe is a really nice tool for doing basic accessibility checks, and the Nightwatch Accessibility node module integrates aXe with Nightwatch so we can include accessibility testing within our functional JS tests without needing to write out the rules ourselves. Even if you don't write any component-specific tests with your Nightwatch setup, including this one accessibility test will give you basic coverage.

$ npm install nightwatch-accessibility

Then we edit our nightwatch.json file to include the custom_commands_path and custom_assertions_path:

{
  "src_folders": ["app/themes/previousnext_d8_theme/src/"],
  "output_folder": "build/logs/nightwatch",
  "custom_commands_path": ["./node_modules/nightwatch-accessibility/commands"],
  "custom_assertions_path": ["./node_modules/nightwatch-accessibility/assertions"],
  "test_settings": {
    ...
  }
}

Then write a test to do the accessibility check:

/**
 * @file Run Axe accessibility tests with Nightwatch.
 */
const axeOptions = {
  timeout: 500,
  runOnly: {
    type: 'tag',
    values: ['wcag2a', 'wcag2aa'],
  },
  reporter: 'v2',
  elementRef: true,
};

module.exports = {
  'Accessibility test': (browser) => {
    browser
      .url(`${browser.launch_url}/styleguide/section-6.html`)
      .pause(1000)
      .initAccessibility()
      .assert.accessibility('.kss-modifier__example', axeOptions)
      .end();
  },
};

Here we're configuring aXe core to check for wcag2a and wcag2aa, for anything inside the .kss-modifier__example selector of our Styleguide. Running this will check all of our components and tell us if it's found any accessibility issues. It'll also fail the build, so when hooked up with something like CircleCI, we know a Pull Request that introduces accessibility issues will fail.

If we want to exclude a selector, instead of the .kss-modifier__example selector, we pass an include/exclude object { include: ['.kss-modifier__example'], exclude: ['.hljs'] }.
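Sketched as a tiny helper (buildContext is hypothetical, not part of nightwatch-accessibility's API), the context argument is either a bare selector string or an include/exclude object:

```javascript
// Hypothetical helper: use a bare selector string when there is nothing to
// exclude, otherwise build the include/exclude object aXe accepts as context.
function buildContext(include, exclude) {
  if (exclude && exclude.length) {
    return { include, exclude };
  }
  return include[0];
}

buildContext(['.kss-modifier__example'], ['.hljs']);
// → { include: ['.kss-modifier__example'], exclude: ['.hljs'] }
```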

If you only add one test, make it one like this. Hopefully once you get started writing Nightwatch tests you'll see how easy it is and eventually add more :)

You can include the accessibility test within another functional test too, for example a modal component. You'll want to test that it opens and closes OK, but once it's open it might have some accessibility issues that the overall check couldn't test for. So we want to re-run the accessibility assertion once it's open:

/**
 * @file dialogTest.js
 */
const axeOptions = require('../../../axeOptions.js'); // axeOptions are now shareable.

const example = '#kssref-6-18 .kss-modifier__example';
const trigger = '#example-dialog-toggle';
const dialog = '.js-dialog';

module.exports = {
  'Dialog opens': (browser) => {
    browser
      .url(`${browser.launch_url}/styleguide/item-6-18.html?hideAll`)
      .pause(1000)
      .initAccessibility();
    browser.click(trigger).pause(1000);
    browser.expect.element(dialog).to.be.visible;
    browser.assert.attributeEquals(dialog, 'aria-hidden', 'false');
    browser.assert.accessibility(example, axeOptions);
    browser.end();
  },
};

Docker

As mentioned above, this all needs a little Docker and Selenium setup too. Selenium has docs for adding an image to Docker, but the setup basically looks like this:

# @file docker-compose.yml
services:
  app:
    # [general docker image stuff...]
  selenium:
    image: selenium/standalone-chrome
    network_mode: service:app
    volumes:
      - /dev/shm:/dev/shm

Then, depending on what other CI tools you're using, you may need some extra config. For instance, to get this running on CircleCI, we need to tell it about the Selenium image too:

# @file .circleci/config.yml
jobs:
  test:
    docker:
      # [other docker images...]
      - image: selenium/standalone-chrome

If you're not using docker or any CI tools and just want to test this stuff locally, there's a node module for adding the selenium-webdriver but I haven't tested it out with Nightwatch.

Don’t forget the manual checks!

There’s a lot more to accessibility testing than just these kinds of automated tests. A layer of manual testing will always be required to ensure a website is truly accessible. But automating the grunt work of running a checklist against a page is one very nice step towards an accessible internet.

Tagged accessibility, CI, nightwatchjs, Functional Testing, aXe

Hook 42: DrupalCamp Asheville

Thu, 07/05/2018 - 13:19

 

Drupal Camp Asheville is next weekend! Join members of the Drupal community of all different skill levels and career paths for 3 days of science, trainings, sessions, and sprints.

Hook 42’s own Adam Bergstein will be sharing some insights on everybody’s favorite online Drupal testing environment, Simplytest.me. He recently took over ownership of the service, and we are excited to hear about where it is going next!

Along with helping organize the camp, our newest team member, Jonathan Daggerhart, will be leading an all day training on Drupal 8 Essentials.

Amazee Labs: Zurich Drupal Meetup - July

Thu, 07/05/2018 - 09:12
Zurich Drupal Meetup - July

We will host the next Drupal Meetup at our Amazee Labs offices in Zurich on 11 July.

Anli de Jager Thu, 07/05/2018 - 15:12

We'll focus our discussions on progressive decoupling, GraphQL, and Drupal.

So, if these topics interest you make sure to join us for an evening of great talks and collaboration.

We hope to see you there!

Date: Wednesday, 11 July 2018

Time: 6:30 PM - 9:00 PM

Venue: Amazee Labs, Förrlibuckstrasse 30, Zürich

OpenSense Labs: Drupal Commerce vs Magento: Comparison 2018

Thu, 07/05/2018 - 07:29
By Shankar

When you want to buy a new shirt from an emporium, you look for the shop that can exhibit the nicest clothes for you to choose from. Not much has changed with the emergence of online stores: you still strive to buy the best thing available on the best e-commerce site. Drupal Commerce and Magento both offer amazing e-commerce platforms for digital businesses to establish themselves as the best in the industry.


Drupal Commerce and Magento have different features and specifications. Determining which one is the most suited for your organization’s needs is a matter to be pondered over. To yield the best crop, comparing both of them side-by-side can give you a better picture.

A brief look at Drupal Commerce and Magento

Drupal Commerce

Powering more than 60,000 sites, the Drupal Commerce module is one of the most flexible e-commerce solutions for websites and applications of all sizes. Drupal 7 and Drupal 8 are the versions that will be maintained and supported for years to come; building an online store on the former requires Commerce 1.x, and on the latter Commerce 2.x.

Drupal Commerce is advantageous in various ways, bringing e-retailers more traffic and driving more results.

Usage statistics for Drupal Commerce from Drupal.org
  • Easy usage: It supports building administration systems and customised business workflows, so even someone without technical expertise can make alterations and run tests, making it a marketing-driven commerce platform.
  • Digital experience: It enhances the digital experience by combining commerce, content, and community, bringing e-retailers more traffic.
  • Business-centric: It gives you the structure you need for your online store without presupposing your business requirements. Simply put, it is the king of customisation.
  • King of content: Built on top of an enterprise CMS, it offers a commerce platform that intertwines content and products seamlessly, driving both online and offline sales via a wonderful UX, optimised merchandising tools, and efficient SEO techniques.
  • Easy configuration: With its robustness and flexibility, it can be configured to fit your enterprise's needs. It is great for any sort of physical or non-physical item that demands payment models like recurring billing, licensing, or subscriptions.
  • Cost-effective: Being open source software, it is an affordable solution.
  • Adaptable: It allows third-party integrations and enhancements to features and functionality to adapt to the changing needs of a business. Be it Authorize.net, Braintree, PayPal, Stripe, Amazon Pay, or a host of other payment gateways, it provides many integrations. Thus, it is highly extensible.
The Obermeyer Drupal project faced plenty of challenges: 3 different websites for 3 distinct audiences, plus autonomous ERP and B2B ordering systems. The objective was to optimise Obermeyer's digital processes and provide an excellent digital experience through a centralised platform. By leveraging the benefits of Drupal 8 and Drupal Commerce 2.x, they built a robust enterprise ecommerce solution offering a friction-free online shopping experience to their B2B, B2C, and VIP audiences.

Magento

Providing a flexible shopping cart system, Magento is another e-commerce platform offering online merchants a robust solution and control over the appearance, content, and functionality of their e-commerce websites. It provides tools for powerful marketing, SEO, and catalog management. Ranging from small-scale sites to large-scale enterprise SaaS-based systems, it is a wonderful platform that accommodates a wide range of business requirements.

Usage statistics for Magento from BuiltWith.com

Some of the highlights where Magento is advantageous are mentioned below:

  • Installation: It is very simple when it comes to installation and features additional layouts and plug-ins.
     
  • Affordable solution: It offers an efficacious and cost-effective solution being an open source software.
     
  • Payment gateways: Some of its worldwide available payment gateways are PayPal, Authorize.net, CyberSource etc.
Drupal Commerce vs Magento: Which one is the better e-commerce platform?

An exhaustive, side-by-side comparison of Drupal Commerce and Magento needs to be hammered out to see where each scores higher.

| Metrics | Drupal Commerce | Magento |
| --- | --- | --- |
| Installation | Easy installation from scratch | Easy installation from the ground up |
| Usage | Easy to learn | Steep learning curve |
| Content management | Can create complex content relationships | Has a basic content management system |
| Catalogue management | Dynamic addition or removal of products possible | Conventional way of displaying products |
| Mobile responsiveness | Fully responsive | Fully responsive |
| Features and functionalities | Great for complex online stores | Good only for simple shopping carts |
| Multilingual | Has built-in modules to create multilingual sites | Multiple Store Views help in building multilingual sites |
| Administration interface | Easily customisable | Difficult to customise |
| Cost constraints | Being open source, it is absolutely free to use | Has shown an identity shift, with newer modules not free to use |
| Business-driven | Can be easily integrated with multiple ventures of your business | Would require two or more independent systems to integrate with multiple ventures |
| SEO tools | Plethora of modules available for enabling SEO | Built-in SEO-friendly tools available |
| Skills required | Requires minimal coding skills | Requires advanced coding skills |
| Integration scope | Third-party integrations possible | Provision for third-party integrations available |
| Security | Most security-focussed | Relatively less secure |

Installation

Drupal Commerce allows developers to install from scratch. An installation package has to be downloaded and installed: you install the Drupal CMS and then enable the Drupal Commerce module, or use an installation profile that enables it automatically.

Magento allows developers to install from the ground up by providing an installation package to be downloaded and installed.

Verdict: It's a draw; neither side wins this contest. Both offer a wonderful platform for installation.

Usage

Having a site running on Drupal and familiarity with the Drupal features and functionalities would make your life easier in learning how to use Drupal Commerce.

Although Magento gives complete control to users and comes with some amazing features, Magento professionals are required to fully extract its potential and use its functionality to the fullest. Someone new to Magento may face a tough learning curve initially.

Verdict: Drupal Commerce has the advantage over Magento, as you do not have to spend a lot of time learning its functionality. Being adept with Drupal helps you learn Drupal Commerce in no time.

Content Management

Built on top of Drupal, Drupal Commerce lets you endlessly create content types with custom fields, attributes, and cool media tools, thereby improving the editing experience. Its provision for content relationships helps you build lists of related products and blog posts. You can also customise your landing pages with optimised product lists. It is great for businesses that consider content the fulcrum of their growth.

In contrast, Magento has a very basic content management system. It only allows you to add pages, some content to different categories of pages, and attributes to products. Anything beyond that requires custom development, which incurs heavy costs and increases ongoing support.

Verdict: Drupal Commerce is, obviously, miles ahead and hugely beneficial in this arena.

Catalog Management

A product can be dynamically added or removed from the product list automatically in Drupal Commerce. You can develop a traditional catalog-like experience and it automatically follows the way you have organized your products and uses attributes associated with them. Be it tables, grids, or lists, products can be exhibited on your website in any pluggable style with each having their own appearance and feel. This immensely helps in building user engagement.

Magento has a very conventional way of how products are organized and added to the list. You build a set of classifications in a catalog root where the products can be included. A product can fall into multiple categories depending on their type and how the users are trying to find these products. But Magento follows a strict provision for displaying products on the site. For instance, only lists and grid views are available for product listings.

Verdict: Drupal Commerce, apparently, has the upper hand in this area.

Mobile responsiveness

Drupal Commerce offers screen flexibility for building the most powerful and versatile business engine in today’s market. Drupal themes help in lending a fully responsive design for the ecommerce website.

For instance, SShop is a bootstrap based Drupal 8 theme with out-of-the-box support for Drupal Commerce. It provides multi-level responsive header menu, slideshow on the homepage, and a custom layout.

Corolla is another stupendously stylised colorable Drupal theme that comes with 6 preset colour schemes, custom colour options, mobile responsive features, and box shadow and background texture options among others.

eStore is a fully responsive, bootstrap-based Drupal theme that comes with cool functionalities like product layouts collection to scan through and choose from, slider content types, custom field additions to the Default Product Type, content types included in the configurations etc.

Magento is not far behind. It adopts responsive web design approach to craft websites that provide an optimal viewing experience across a wide range of devices. For instance, the out-of-the-box Magento Blank and Luma themes offer a fully responsive design.

Verdict: Providing a beautiful site design and making it responsive across devices, both platforms are equally beneficial in this area.

Features and functionalities

One of the foremost advantages of Drupal Commerce is its huge and growing list of modules for enabling customised features and functionality. It lets you build a custom application and provide a wonderful digital experience by incorporating features that go beyond commerce, adapting to changing times and business needs. Whether you need to add web services or forums, it gives you the space to add entirely new functionality.

Magento comes with a nice set of features which is really good for your online store. If you are looking at a product on the basis of its published feature set, then Magento is good for you. So, in simple words, if you just need to develop a shopping cart, opt for it.

Verdict: Hence, unlike Magento, Drupal Commerce is not just a simple shopping cart; it can customise your online shopping site for a wide array of needs.

Multilingual capabilities

Drupal 8 comes with out-of-the-box modules for translating the content on the website. Its built-in language handling abilities help in delivering localised digital experiences by letting you choose from 94 languages.

It also offers the individuals who work on the site - site administrators, content editors, and translators - the choice of their own language. From pages to the taxonomy terms, it can translate everything. Even the configuration of the site like blocks, panels, views etc. can be translated.

With every Magento theme, you can incorporate a multilingual storefront without a lot of trials and tribulations thereby allowing the user to switch between languages with ease.

Store Views are the store instances in the Magento platform. If you are running an ecommerce website in a single language, a single Store View is required. For multilingual sites, it allows you to have multiple Store Views.

Verdict: Both platforms share this arena equally and let you run a great multilingual online store.

Administration interface

Without even making changes to the code, you can customise the admin interface as per your needs in Drupal Commerce. It lets you create screens for specific users with certain administrative actions.

Magento gives you a structured and properly defined way to govern the entire store. If you are content with its built-in admin interface functionality, then it's good for you. But if the need arises for more features, a lot of development effort is required to make customisations.

Verdict: With provision to choose from a huge list of Drupal modules to make the admin UX better, it should be a no-brainer of a choice that points to Drupal Commerce.

Cost constraints

Drupal is an open source technology. So, it is absolutely free to use.

In contrast, Magento is showing an identity shift, according to Gartner's Magic Quadrant for Digital Commerce in 2018. It states that some newer Magento modules are not being offered as open source. Prospects drawn to Magento because of its strong reputation in the open source community should be aware of this tectonic shift and determine whether or not they are comfortable with it.

Verdict: Being an open source technology, Drupal Commerce clearly excels in this category. 

Business-Driven

Drupal Commerce can help make a site with any of these functionalities or all at once: a multilingual site, a multisite, community forum or an online store. So, it can integrate with multiple business platforms under one umbrella.

To integrate Magento with other parts of your business, like forums or subscriptions, you will have to deal with two independent systems working with each other. This calls for third-party solutions to keep data between the systems in sync and up to date.

Verdict: Drupal Commerce clearly wins this race by miles with a wide pool of benefits for your business.

SEO tools

You can enable SEO through superabundance of modules in Drupal like Pathauto, XML Sitemap, Redirect etc.

With Magento’s out-of-the-box SEO tools, you can easily adjust URLs, meta information, and verify search terms and ranking through Google integration.

Verdict: Both of them offer a perfect platform for making the site search friendly.

Skills required

Without much coding experience, you can pick up the basics of Drupal Commerce quickly. For someone with minimal coding knowledge, it offers modules like Views and Features to power the user interface. Even for a coding genius, these modules can be really helpful.

With heavy usage of objects, inheritance, and other programming concepts, someone with minimal coding experience may find it tough to get up to speed quickly in Magento. Also, understanding and detecting Magento extension clashes can be tricky. Its documentation is just okay, but the active Magento community and a wide range of training and support offerings can be pretty useful.

Verdict: With an active, engaging and responsive Drupal Community and a plethora of training and support services available, Drupal Commerce should be the default choice.

Integration scope

Drupal Commerce can seamlessly integrate with third-party tools. Some of the examples are: 

Magento offers support for third-party integrations too. Some of the example categories are:

  • Magento theme or template integration allows you to use third-party themes from Themeforest or TemplateMonster.
  • SMS gateway integration with third-party service providers like Kapsystem is possible in Magento
  • It also allows Payment Gateway integration with Paypal, Amazon Pay etc.
  • Magento API integrations can be done with Xero, Salesforce, Box etc.
  • It can also be integrated with leading CMS like Drupal.

Verdict: Both the platforms offer third-party integrations.

Security

Statistics show that Drupal is the most security-focussed of the major CMS platforms, including WordPress, Joomla, and Magento. Both the number of infected websites and the infection rate for Magento were much higher than for Drupal in Sucuri's Hacked Website Report.

Source: Sucuri.net

Verdict: Drupal is the most secure CMS and should be prioritised over Magento.

Conclusion

To build the best e-commerce site, digital firms need an e-commerce platform capable of delivering a unique, first-class website. Drupal Commerce and Magento both provide a platform for setting up an e-commerce site efficiently. To understand what suits your organisation's needs, you have to properly understand their features and functionality and choose wisely. The side-by-side comparison shows that Drupal Commerce has a clear edge over Magento and comes out as the winner.

We provide Drupal Commerce services to help you scale up. Contact us at hello@opensenselabs.com to gauge which platform would be the best for your enterprise needs.

Tagged Drupal Commerce, Magento, Drupal, Drupal 8, Ecommerce platform, Commerce 2.0, Ecommerce, Payment gateway, Content Management System, CMS, Mobile Responsive, Multilingual Site, Open Source, Multisite, SEO

InternetDevels: Search in Drupal 8: great opportunities and useful modules

Thu, 07/05/2018 - 03:22

Convenient searching is a keystone of good website navigation and usability. Help your website users always find what they want, and they will lose all the addresses of your competitors, completely forgetting their names.

Read more

myDropWizard.com: CiviCRM secrets for Drupalers: Membership Directory

Thu, 07/05/2018 - 01:58

We're Drupalers who only recently started digging deep into CiviCRM and we're finding some really cool things! This series of videos is meant to share those secrets with other Drupalers, in case they come across a project that could use them. :-)

In the screencast below, I'll show how you can set up a public membership directory in Roundearth's CiviCRM! Many organizations need to let public visitors see their member individuals or organizations.

Watch the screencast to see how to add a public membership directory with Roundearth:

Video of CiviCRM Membership Directory

Some highlights from the video:

  • Add Memberships to a Contact
  • Create a Smart Group of Members
  • Boom: Create a Public Membership Directory

Please leave a comment below!

Evolving Web: Scholarship to Learn Drupal 8 - Call for Applications

Thu, 07/05/2018 - 00:38

Evolving Web is proud to announce that it's giving out a scholarship for our 5 Day Drupal 8 Training in Toronto in October. Everyone is welcome to apply. We encourage students, freelancers, and those looking to switch careers to attend, as well as anyone wanting to learn Drupal for a project for the social good.

If you're not in Toronto, we'll be offering scholarships in the future for trainings in other cities and online. Subscribe to our newsletter for updates.

Why do you need this training?

Many businesses, higher education institutions, governments, NGOs, and communities around the world use Drupal to power their websites. Drupal is one of the most sought-after CMS technologies and there is generally a shortage of Drupal talent.

Learning Drupal can be challenging and while there are many resources online, it's useful to have a hands-on course to guide you through the process.

What will you learn in this program?

During the course, you'll learn how to build websites with Drupal from top to bottom. The class is divided into three parts:

Who should apply?

Scholarships will be given to those who demonstrate an eagerness to learn, a vision for how they will implement their Drupal skills, and a need for financial support.

It's important that applicants clearly indicate why they want to learn Drupal and what projects they're planning to use Drupal for.

Some experience with HTML, CSS, and programming is required to take the course. Please contact us if you have questions about the course requirements.

Course Details
  • Dates: October 1, 2018 - October 5, 2018 - 5 full days of intensive learning
  • Location: Toronto (Centre for Social Innovation)
  • Cost: $1,850
  • We’re giving away 1 scholarship for the full class as well as 3 scholarships for 50% discounts  — apply today!
  • Deadline for applications: August 17, 2018
+ more awesome articles by Evolving Web

DrupalEasy: Introducing our online, hands-on, 2-hour Professional local development with DDEV workshop

Wed, 07/04/2018 - 09:49

We're happy to announce that we've partnered with the folks at DRUD Tech to create and deliver a live, online, hands-on workshop that will teach the basics of professional local Drupal development using DDEV-Local.

DDEV-Local is a Docker-based local development environment that is designed to get your projects up-and-running quickly on your local development machine. DDEV-Local can be used with Drupal 7 and 8 as well as WordPress and other content management systems.

One of the big advantages of using this type of local development environment (as opposed to an old-school WAMP/MAMP-type solution) is that DDEV helps ensure that every member of the development team is using the exact same development environment, increasing productivity and decreasing the chances of environment-specific bugs. Furthermore, you'll find that getting team projects up-and-running on your local machine is super-fast!
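As an illustration, a DDEV-Local project is described by a small .ddev/config.yaml file checked into the project. The values below are a hypothetical sketch for a Drupal 8 project; the exact keys and defaults depend on your DDEV version, so treat this as a flavour of the tool rather than a reference.

```yaml
# .ddev/config.yaml — minimal, hypothetical example for a Drupal 8 project.
name: my-drupal-site   # project name; also used in the local URL
type: drupal8          # tells DDEV which CMS defaults to apply
docroot: web           # path to the Drupal docroot, relative to the project root
php_version: "7.1"     # PHP version for the web container
```

Because this file lives in the repository, every teammate who runs `ddev start` gets the same containers, which is exactly the consistency benefit described above.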

If you've been reading our recent blog posts or listening to our podcast, then you probably already know that we've taken a keen interest in local development environments lately. 

In fact, we've been diving deep into local development environments for almost a full year now, as we're in the process of evolving our long-form training classes to teach and utilize a more professional local development environment solution. A couple of months ago, we decided to standardize our trainings on DDEV, and since then we've been talking with the DRUD Tech folks (the creators of DDEV) about putting together a workshop that will provide students what they need to get up-and-running with DDEV-Local.

Here's a quick overview of what this new workshop will cover:

  • What is DDEV? 
  • Installing DDEV-Local (Mac OS X and Windows 10 Pro)
  • Getting an existing project up-and-running in DDEV-Local
  • Everyday DDEV-Local commands and functionality
  • DDEV-Local integration with hosting providers  
  • Updating DDEV-Local
  • DDEV-Local Tips and Tricks
  • Getting help with DDEV-Local

The first workshop will take place on Wednesday, July 18, 2018, 1-3pm EDT, and the cost is $75. Following the completion of the workshop, you'll have access to the 20+ page curriculum PDF as well as more than 10 screencasts demonstrating installation and basic usage of DDEV-Local. 

Register today and start using a professional local development environment!

We’ll be running the workshop monthly, so if you can't make it on July 18, upcoming dates include: 

  • Wednesday, August 22, 2018, 10am-noon EDT
  • Wednesday, September 19, 2018, 9-11am EDT

Quotes on this page were shared by participants in our beta-test of the course.


 

Evolving Web: How to Create a Custom Views Argument Plugin in Drupal 8

Wed, 07/04/2018 - 09:24

When you're building a site, it's standard to set up "pretty URLs". For example, we'd rather see /catalog/accessories than /taxonomy/term/18 and we'd rather see /about-us than /node/19. With Drupal, we use the Pathauto module to configure these types of standard content paths. But in some cases, there's no option to create a nice URL pattern, so we end up seeing those URLs.

This happens with Views that are filtered by a taxonomy term in the URL. The result is that you end up seeing taxonomy term IDs in the URL (e.g. catalog/19/search) rather than a pretty URL (e.g. catalog/accessories/search).

In this tutorial, I'll walk you through how to create a field that is used to generate URLs for Views pages that take taxonomy terms as contextual arguments. Using this approach, a site editor will be able to configure what the URL path will look like.

Assumptions

This tutorial assumes that you know:

  • The basic concepts of Drupal 8.
  • How to configure fields.
  • How to configure a view with a contextual filter.
  • How to create a custom module in Drupal 8.
  • How to create a custom plugin in Drupal 8.

The solution

We're going to use the core taxonomy argument plugin (Drupal\taxonomy\Plugin\views\argument\Taxonomy) to help us fix this problem. The plugin takes a term ID from the URL and passes it to Views. We'll override the plugin so that it takes a string from the URL (slug), and then looks up the associated term ID. All the rest of the functionality we'll leave as is.

Step 1: Content and field configuration

To make the example work, we need the following configuration in place (most of this comes out of the box with Drupal; you'll just need to add the Slug field):

  • A taxonomy vocabulary named tags.
  • Tags should have the following field:
    • Field name: Slug
    • Machine name: field_slug
    • Type: Text (Plain)
    • Size: 32 characters
  • A content type named article.
  • Article should have the following field:
    • Field name: Tags
    • Machine name: field_tags
    • Type: Entity reference (to taxonomy terms from Tags)
    • Number of values: At least one

Configuring field slug on taxonomy term

Step 2: Create a custom module

Create a custom module called custom_views_argument. Declare a dependency on the views module in the .info.yml file.
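For reference, a minimal .info.yml for such a module could look like the following (the human-readable name and description are placeholders; only the views dependency is essential here):

```yaml
# custom_views_argument.info.yml
name: Custom Views Argument
type: module
description: 'Provides a Views argument plugin that accepts taxonomy term slugs.'
core: 8.x
dependencies:
  - drupal:views
```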

Step 3: Implement hook_views_data_alter()

Reference: custom_views_argument.module

hook_views_data_alter() tells Views about the various database tables, fields, and the plugins associated with them. We implement this hook to register our custom argument plugin, which we will create in the next step.
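The referenced .module file isn't reproduced here, but the hook can be sketched roughly as follows. The table and column names below assume the field_tags configuration from Step 1 and are illustrative; verify them against your own site's Views data before relying on them.

```php
<?php

/**
 * Implements hook_views_data_alter().
 *
 * Illustrative sketch: swaps in our slug-aware argument handler for the
 * field_tags target ID column, so Views offers it as a contextual filter.
 */
function custom_views_argument_views_data_alter(array &$data) {
  if (isset($data['node__field_tags']['field_tags_target_id'])) {
    $data['node__field_tags']['field_tags_target_id']['argument']['id'] = 'custom_taxonomy_slug';
  }
}
```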

Step 4: Create a custom views argument plugin

Reference: CustomTaxonomySlug.php

Next we implement the CustomTaxonomySlug class with a proper annotation @ViewsArgument("custom_taxonomy_slug"). This tells the Views module that the class is a special class which implements a custom Views argument plugin. We extend the Drupal\taxonomy\Plugin\views\argument\Taxonomy class and override one important method CustomTaxonomySlug::setArgument().
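Putting those pieces together, the class declaration might look like this (the namespace assumes the module name from Step 2; this is a skeleton, not the full reference file):

```php
<?php

namespace Drupal\custom_views_argument\Plugin\views\argument;

use Drupal\taxonomy\Plugin\views\argument\Taxonomy;

/**
 * Accepts a taxonomy term slug (field_slug) as a contextual argument.
 *
 * @ViewsArgument("custom_taxonomy_slug")
 */
class CustomTaxonomySlug extends Taxonomy {
  // The setArgument() override and the convertSlugToTid() helper go here.
}
```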

public function setArgument($arg) {
  // If we are not dealing with the exception argument, example "all".
  if ($this->isException($arg)) {
    return parent::setArgument($arg);
  }
  // Convert slug to taxonomy term ID.
  $tid = is_numeric($arg) ? $arg : $this->convertSlugToTid($arg);
  $this->argument = (int) $tid;
  return $this->validateArgument($tid);
}

All we do here is catch the argument from the URL and, if it's a slug, use the convertSlugToTid() method to retrieve the underlying taxonomy term ID. That's it! Everything else is handled by the parent taxonomy plugin.
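The tutorial doesn't show the body of convertSlugToTid() itself; a hypothetical implementation as a method on the same class, assuming the field_slug field from Step 1, could look like this:

```php
<?php

/**
 * Looks up the taxonomy term ID for a given slug.
 *
 * Hypothetical helper (a method on CustomTaxonomySlug);
 * returns 0 when no term matches.
 */
protected function convertSlugToTid($slug) {
  $terms = \Drupal::entityTypeManager()
    ->getStorage('taxonomy_term')
    ->loadByProperties(['field_slug' => $slug]);
  $term = reset($terms);
  return $term ? (int) $term->id() : 0;
}
```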

Step 5: Create Demo Content

Now that everything is in place, we'll put our solution to the test. Start by creating some demo content: create 2-3 articles and assign them some tags. The tags will be created automatically, but they won't have slugs yet.

Once done, go to the Admin > Structure > Taxonomy > Tags page and edit the tags and give them nice URL slugs containing only English alphabet letters, numbers and dashes. For real projects, you might need to use a custom or contrib module to automatically generate slugs (see the machine_name module).
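If you do want to generate slugs programmatically rather than typing them by hand, a small helper along these lines would do. Note that slugify() is a hypothetical name for illustration, not part of the tutorial's module or of the machine_name module mentioned above:

```php
<?php

/**
 * Derives a URL slug (lowercase letters, digits, dashes) from a tag name.
 */
function slugify(string $name): string {
  $slug = strtolower($name);
  // Replace every run of non-alphanumeric characters with a single dash.
  $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);
  // Strip any leading or trailing dashes left over from punctuation.
  return trim($slug, '-');
}

echo slugify('Accessories');     // accessories
echo slugify('Winter Jackets!'); // winter-jackets
```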

Step 6: Configure a View

Now we're all set! The last step is to create and configure a View which will put everything together.

  • Create a View of Content. You can name it Blog.
  • Create a page display and set its URL to /blog/%.
  • Add a relationship to taxonomy terms referenced from field_tags.
    • We do this to be able to use the Slug field in a filter. 

Configure a relationship with taxonomy terms

  • Now, define a contextual filter for the Slug using the custom argument plugin which we created.
    • Click on the Add button for Contextual filters
    • Choose the Slug filter which we created. It should have the name we defined in our plugin, i.e. Custom: Has taxonomy term with slug.
    • Optionally, specify a validation criteria for Taxonomy Term and specify the Tags vocabulary.

Configuring the custom views argument plugin for contextual filters

  • Save the view for the new configuration to take effect.

And we're done! If you visit /blog/SLUG, you should see all the articles which have the taxonomy term associated with SLUG. Here, SLUG refers to the value you put in the Slug field for the tag. E.g. if you have a tag named Accessories and you wrote accessories in its Slug field, you would go to the URL /blog/accessories.

Next steps

If you're looking for hands-on training about how to develop Drupal modules, we offer professional Drupal training online and in-person. See the full Drupal 8 Module Development training curriculum for details.

+ more awesome articles by Evolving Web

OPTASY: These Are the 15 Best Drupal Security Modules Worth Installing on Your Website

Wed, 07/04/2018 - 09:22
By adriana.cacoveanu

See? I'm a woman of my word: here I am, as promised in my previous post on the most effective ways to secure a Drupal website, ready to run a “magnifying glass” over the best Drupal security modules, to pinpoint their main characteristics and most powerful features, and to reveal why they've made it to this list.

And why you should put them at the top of your own Drupal security checklist.

So, shall we dig in?
 

1.  Login Security  

It's only predictable that, since the login page/form is the entry point to your Drupal site, it's also the most vulnerable page.

Therefore, secure it!

Droptica: How does the Droptica team continuously improve its work? See examples from our Sprint Retrospective meetings

Wed, 07/04/2018 - 07:07
Your development team constantly makes the same mistakes and does not improve the quality of its work. Are you familiar with this issue? If so, we have a solution for you and it’s really simple – it’s called Sprint Retrospective.

What is Sprint Retrospective? This is one of the meetings in the Scrum methodology. The meeting takes place on a regular basis, at the end of each sprint. At Droptica, we hold the meeting once every two weeks. During it, the development team analyses the previous sprints and deliberates about what can still be improved – how to eliminate errors and blockers, and how to speed up work.
