Blogs

Improve internal news with the new Attini Feature: Shadow Copy

With the popularity and the usage of our internal news app Attini Comms expanding, we figured it was time to bring a new feature into the game for those organizations that have become lean, mean news machines. It is called Shadow Copy and it allows you to copy an article that you found in a random channel to your own channel. This way news can spread even faster.

Who is it for

This feature is very useful for organizations with over ten news channels and/or where the creation of news is a decentralized activity. It will help owners of a channel to easily bring in interesting articles from other channels and present them to their own audience. And the readers of those channels of course also benefit, for this will bring more news their way.

How it works

Since the Shadow Copy feature is quite a big pack of functionalities, it helps if we explain the way it works based on a scenario. So, let’s first meet the stars of our story.

First up is Mark, the Attini administrator of Basically A Random Company Inc. (BARC Inc.), who makes sure that all twenty channels he administers are kept in tip-top shape. Second is Nicole, the Director of the Sales Department and owner of the Sales news channel. Third, we have Harold, an employee of BARC Inc.

Let’s say that in the IT channel of BARC Inc. a very interesting article has been published about how to safely connect to Wi-Fi networks at work. The tips and tricks mentioned in the article are very useful to all the colleagues working in the IT Department for setting up secure connections. So, Mark has read the article and, during a chat at the water cooler, tells Nicole and Harold about it. Harold is a real news buff who has already read the article and encourages Nicole to check it out. Nicole knows that a lot of Sales colleagues work from all different places, so she decides to look up the article in the IT channel.

When the three colleagues look at the article they all have a different set of options presented to them due to the new Shadow Copy feature:

In the top right corner of the article page, a menu is presented based on the permissions of the user looking at the page. Each channel for which you have permission to publish an article will be shown. That means that Mark sees all channels (because he is the Attini Admin), Nicole only sees the Sales Channel (because she has Contribute permissions only for that channel) and Harold doesn’t even see the menu since he owns no channels at all.

After reading the article, Nicole decides to copy it to the Sales Channel. During the copy, a new article is created within the Sales Channel. However, this is no ordinary article. For one, Nicole cannot edit the content of the article. She wanted to copy the article and that is what she gets: an exact copy. But more importantly, a relationship is created between the original article (further known as the mother) and the copied article (further known as the child).

This relationship is used to update the child article as soon as a change is published in the mother article and to show where the child article originally came from. Showing where the article originally came from is important information. It leads Harold the news buff from the child article to the channel that possibly contains more articles about internet security. And it leads other channel owners to the mother article, from which they can also create a copy for their news channel. This is important because the feature does not allow copying a child article. To put it in family terms, we will allow creating sibling articles but aim to avoid a grandchild or even great-grandchild article being created.

Now that the article is copied, Nicole has brought the tips and tricks about internet security to her audience. With an easy click of a button she has made the Sales colleagues happy. However, we did not forget about Harold, our news buff. Because like a true news enthusiast, Harold has subscribed to all twenty channels that BARC Inc. has. This means that he gets all the articles from the IT Channel (containing the mother article) and all articles from the Sales Channel (containing the child article). Of course, we filter the duplicates out of every feed so no one sees the same article twice. Mark, Nicole and Harold see the following items in their feeds.

Harold only sees one of the Internet Security articles, because the duplicates are filtered out of his feed. In this case the child is filtered out and the mother article is featured in his feed. As long as the reader is subscribed to the channel which holds the mother article, they will only see the mother article in their feed. If an article is copied to multiple channels, so that multiple child articles exist, and a user is not subscribed to the channel holding the mother article, then the oldest child article available to that user is the one shown.

So, if the Internet Security article were copied to a third channel, Harold would still only see the mother article in his feed. If Harold were to unsubscribe from the IT channel, which holds the mother article, he would then only see the oldest child article in his feed: the one in the Sales channel, because Nicole copied the article first.

By filtering out the duplicates we can guarantee that the Shadow Copy feature can be used without limit, bringing news articles to new audiences and not polluting the feeds of people who already had access to the article. And by only showing the oldest article a user is subscribed to, we make sure that every user sees the article when it has the most news value for them.

Customers using the <link to sticky news>Sticky News feature</link to sticky news> need not fret, as the deduplication in Shadow Copy means that Harold will only see one of the sticky articles if the sticky article was the one copied by Nicole.

How to get started

Are you interested in adding the Shadow Copy feature to your Attini Comms installation? Then feel free to contact us by sending an email to attini@rapidcircle.com. We will be happy to run you through all the details and get you started!

Do you want to give your company news an additional boost in reaching all your colleagues? Then make sure you check out the Sticky News feature we just released.

PNP PowerShell: Maintain all your Termset data across tenants

The Term store manager available in SharePoint enables companies to manage their enterprise-specific taxonomy easily through new term groups and term sets. This metadata can then be referenced by users for selecting choices when filling in profile or content-related data.

Enterprise taxonomies can sometimes contain dozens of groups with too many term sets and terms to manage, update or copy manually. There are standard ways to export taxonomies to a .csv file and import them into the term store of a different tenant.

But what if you want to not only export term sets and term labels but also their other term-specific data and configuration such as:

  • Localised labels for each term
  • Synonyms
  • Navigation settings (if the term set is intended for site navigation)
  • Custom properties associated with each term
  • The exact GUID of the term

The above data may not be interesting for end users; for administrators, content creators and developers, however, these additional elements of a term are very important.

Fortunately, we can export all the term set configuration using the powerful and very useful PnP PowerShell cmdlets.

Thanks to the efforts of the Microsoft Patterns & Practices (PnP) community, we now have a set of useful PowerShell cmdlets that can help us. The list of cmdlets is continuously growing, and we find that as administrators we can accomplish many more tasks using CSOM.

Specifically, the cmdlets that we can use are:

Export-PnPTermGroupToXml – Enables us to export a term group and all its underlying terms’ settings to an XML-based output file

Import-PnPTermGroupFromXml – Enables us to import a term group from an XML file

Export your taxonomy

To use the cmdlets, I first need to enter credentials and connect to my SPO tenant’s content type hub site collection:
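The connection step might look like the following minimal sketch, assuming the legacy SharePointPnPPowerShellOnline module and a placeholder tenant URL:

```powershell
# Prompt for credentials and connect to the site collection
# (the URL below is a placeholder; replace it with your own tenant)
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)
```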

Once connected, I simply need to pass an output file and the name of the term group I want to export.
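For example, assuming a hypothetical term group named "Enterprise Taxonomy" and a local output path:

```powershell
# Export the term group with all its term sets, terms, labels,
# synonyms, custom properties and GUIDs to an XML file
Export-PnPTermGroupToXml -Identity "Enterprise Taxonomy" -Out "C:\Temp\TermGroup.xml"
```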

Looking at the exported XML, you can see that all the relevant term settings, including GUIDs, are now available to import into another term store.

Importing your taxonomy

The import is done in a similar manner.

Connect to the destination tenant

Pass the XML file as a parameter as seen below
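A minimal sketch, reusing the export file produced in the previous step:

```powershell
# Import the term group (including all underlying term sets and terms)
# into the destination tenant's term store
Import-PnPTermGroupFromXml -Path "C:\Temp\TermGroup.xml"
```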

That’s it!

Improve internal news with the new Attini Feature: Sticky News

  All news is important, but some news is more important than other news. And that is why we bring you Sticky News as a new feature inside the number one Office 365 news application.

Sticky News is a feature that is available as of this summer for Attini Comms. The new feature allows the creator of a news article to flag their article as being extra important which will make sure the news article remains at the top of the list in the news feeds. So instead of being pushed down by other articles with a more recent publishing date, the sticky news article will remain at the top of the news feed for the number of days that was specified by the writer of the article. To give it an extra touch of importance it is possible with a slight design change to highlight a sticky news article by displaying it in an offsetting color, depending on the type of Attini Reader web part you are using.

How To Use

As you may have noticed, the technical and functional changes that are required to turn a news article into a sticky news article are not that major. However, the feature has proven to be a total game changer for organizations that are already used to Attini. The reason why is simple: having the ability to make your articles sticky gives you a big advantage when spreading your news. And as our favorite spider-like superhero so famously stated, “with great power comes great responsibility”.

The Sticky News functionality can be granted to each news channel that you have within your Attini Comms setup and this is done via the Attini Comms Dashboard. This means that only an Attini Administrator has the power to turn a news channel into a sticky news channel that can produce sticky news articles. Also, it is good to note that the power can be taken away as well by the Attini Administrator. So, the control over who gets the power and who doesn’t lies with the person administrating the whole Attini Comms landscape at the customer side, exactly in the place where you want to have this control.

Once the Attini Administrator has granted the Sticky News functionality to a channel, the owners of that channel have the power to create Sticky News articles. Per article they have the option to upgrade it to a Sticky News article or just publish it as a regular article. And since the difference is indicated by a simple flick of the switch, it is easy to make your news sticky. Even if you posted an article as a regular news story and the next day you decide to make it Sticky, you simply check the box and publish the article again to move it to the top of the feed.

Per channel you can have a maximum of three Sticky News articles. Let’s assume you post one sticky news article every day. If you start on a Monday, the Monday sticky article is in spot one and the rest of the spots are filled by other articles based on the priority model that you have chosen (most recent, most liked, most viewed, etc.). On Tuesday, your Tuesday sticky article will take spot one, the Monday sticky article will move to spot two, and spots three and lower are taken by the rest. On Wednesday, your Wednesday sticky article will take spot one, the Tuesday sticky article will take spot two, the Monday sticky article will move to spot three, and spots four and lower are taken by the rest. On Thursday, everything moves again when your Thursday sticky article is published, but now the Monday sticky article will no longer be considered sticky and will be ranked amongst the rest according to the ranking model. Below, an overview is given of how the feed would look on each of the days.

Up to this point the story is plain and simple. You have one channel fitted with the sticky news and the owner of that channel can choose which stories to make sticky. However, most of our customers have multiple news channels, which in practice means anywhere from two to over fifty channels. And that is when the Sticky News scenario becomes interesting. Because the more news you have, the bigger the benefit is to have a way to make an article stand out. On the other hand, the more people who have the power to put their news at the top of the feed for days, the higher the risk will be for editors gaming the system.

To make sure you get the benefit from the Sticky News functionality and not turn the business of publishing news into a free for all fight about who can get their article at the top of the feed, some best practices need to be taken into consideration.

  • First, only channels with a wide audience should be candidates for the Sticky News upgrade. If a channel only has 50% or less of the company in the audience set, then the news published there is already targeted at such a specific group that probably every story is equally important for the reader.
  • Second, only channels with a high frequency of publishing articles need a functionality like Sticky News. If you only publish an article once a week, it will stay at the top of your feed anyhow, so no need for making it stick.
  • Third, make sure that if multiple people have the power to create Sticky News that they communicate with each other on a regular basis. If the power lies with a central communications team or colleagues near one another they have ample opportunities to discuss which stories should dominate the feeds the coming days. The situation that you want to avoid is two colleagues that never talk to each other battling it out over the news channels to get their news on top. The result of such a situation is seen in the comment section of every YouTube video and always turns ugly.
  • Fourth, be sure to monitor the use of the Sticky News functionality after it has been granted to a channel. If the functionality is not used properly (or not used at all) it is worth a conversation with the owners of that channel. And don’t be shy to take away the power again, because having not so interesting articles dominating the feeds may make the creator of that news very happy, but could make your readers frustrated.

How to get started

Are you interested in adding the Sticky News feature to your Attini Comms installation? Then feel free to contact us by sending an email to attini@rapidcircle.com. We will be happy to run you through all the details and get you started!

Do you want to give your company news an additional boost in reaching all your colleagues? Then make sure you check out the Shadow Copy feature we just released.

PNP PowerShell: Managing Content Type Artefacts across a single or multiple Office 365 tenants

Creating content types in SharePoint has always been relatively easy for site and content administrators. Furthermore, with the Content Type Hub feature, custom content types can be centrally defined and pushed out to all site collections. The challenges and difficulties, however, arise when you want to make some inherent changes to these site objects or want these exact site objects to be present across your DTAP (Dev, Test, Acceptance & Production) environments.

For instance,

  • I’ve created my custom content types in my dev tenant. How do I migrate the changes to production?
  • How can I update the internal name of a field within a content type and ensure that the changes are reflected everywhere?

Actions like these were (and still are) generally avoided because there was no good way of accomplishing them. It is still very good practice to thoroughly prepare and review what’s needed before creating custom content types. Making changes to these artefacts still requires effort, especially when there is content already using them.

Fortunately, the ability to manage existing content types has gotten easier, thanks to the efforts of the Microsoft Patterns & Practices (PnP) community.

We now have a set of useful PowerShell cmdlets that can help us. The list of cmdlets is continuously growing, and we find that as administrators we can accomplish many more tasks using CSOM.

You can go through the PnP cmdlet documentation here: https://github.com/SharePoint/PnP-PowerShell/tree/master/Documentation

I want to focus on creating content types and managing changes to these artefacts; for this, you use the following two PnP cmdlets:

Get-PnPProvisioningTemplate: Enables you to extract a site template with all or a partial set of the site artefacts. The extract is an XML file which can be reviewed and updated.

Apply-PnPProvisioningTemplate: Enables you to apply an extracted site template to an existing site, essentially providing you with a means to apply changes to all sites in a tenant or in a different tenant.

The overall process then would look like this:

Create custom artefacts in content type hub

As usual, create your fields and content types in the content type hub. I recommend that you:

  • Tag these artefacts in a custom group so they are easily identifiable
  • Decide on a naming convention for both fields and content types that helps others to see that these are custom artefacts
  • Avoid spaces in field names when initially creating them. Otherwise you end up with internal names looking like this

Here, the space is replaced with the hexadecimal sequence “_x0020_”. This is not a critical issue; however, it can be avoided and corrected.

I’ve created a content type in a unique group:

With a custom field Document Category

Extract artefacts using Get-PnPProvisioningTemplate:

Using the cmdlet, I first enter credentials and connect to my SPO tenant’s content type hub site collection.

Then I extract only the fields and content types using the -Handlers parameter.
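Together, the two steps might look like this sketch (placeholder URL and output path; the handler names are the ones used by the PnP provisioning engine):

```powershell
# Connect to the content type hub site collection
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

# Extract only the fields and content types into an XML template
Get-PnPProvisioningTemplate -Out "C:\Temp\ContentTypes.xml" -Handlers Fields,ContentTypes
```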

Make changes to your artefacts in XML

In your xml output file, you will find all the Fields and Content Types. You search for the relevant ones by looking for the group name (“Demo Only” in my case)

You can now edit field properties such as the StaticName and Name

Be sure to update the reference to the updated field name in the corresponding content types as well. In my case I had created a “Demo Content type”

Modified to

Once you’re satisfied with your changes, save the XML file and you are ready to apply the changes to the original content type hub site collection.

Apply changes using Apply-PnPProvisioningTemplate

Connect to your content type hub site collection again:

Run the Apply-PnPProvisioningTemplate with the updated xml file as an input:
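Assuming the updated XML file from the previous step, the call might look like:

```powershell
# Apply the edited template back to the connected site collection
Apply-PnPProvisioningTemplate -Path "C:\Temp\ContentTypes.xml"
```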

I changed the static name of “Document_x0020_Category” to “Document_Category”, which is now reflected when viewing the field column URL:

This was a simple demonstration of the scripting tools available to manage site artefact changes that previously were difficult or impossible to make.

Changes can now be pushed out to all site collections by republishing the updated content type:

Using this same technique, with a bit more preparation, you can also extract a set of custom content types from one tenant and apply them to another, thereby keeping field names, content types and their internal GUIDs all intact!

The forgotten part of an Office 365 migration: Network Connectivity

Migrating an on-premises email infrastructure to Office 365 is pretty straightforward; whether you have Exchange 2003 (even!), 2007, 2010 or 2013, lots of documentation and migration scenarios are available on the web to make it a successful migration project. But after popping the champagne when the first mailboxes have moved successfully, the first complaints reach the service desk: the (Outlook) performance is getting slower and slower. What went wrong? Did we not follow all the procedures in the “Deployment Guides”?

The slow performance is probably related to Network Connectivity. But how can we solve this? The answer is not simple; it contains several important steps to optimize the network performance.

1. TCP Window Scaling:

To use a high bandwidth link efficiently, the connection must be filled with as much data as possible as quickly as possible. With a TCP Window Size limited to 64k when Window Scaling is disabled, not all the available bandwidth is used.

By increasing the Window Size beyond 64k, the sending machine can push more data onto the network before having to stop and wait for an acknowledgement from the receiver.

Check this setting on your network perimeter devices.
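On a Windows client you can quickly inspect the current window-scaling (auto-tuning) behavior; a sketch, with output varying per OS version:

```powershell
# Show global TCP parameters, including the
# "Receive Window Auto-Tuning Level" (should not be "disabled")
netsh interface tcp show global

# The same information via the NetTCPIP module
Get-NetTCPSetting | Select-Object SettingName, AutoTuningLevelLocal
```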

2. TCP Idle time settings:

Network Perimeter Devices (like firewalls) are normally designed for internet access to Web Pages. This means TCP Sessions were not expected to be idle for a long time. If there were any idle TCP Sessions, the firewall simply closed them. Users were not affected by this using web pages only, but now the situation is different: we’re using Outlook to connect to our Office 365 Mailbox. Outlook leaves TCP Sessions open for a period of time (as long as Outlook is open) and when the firewall kills the “idle” TCP connection, Outlook hangs, causes disconnect pop-ups or even prompts the user for a password.

To solve this problem, make sure the perimeter devices are configured consistently: keep the SSL/TCP idle session timeouts for “normal” traffic around 2-3 minutes, but create a separate group for Office 365 traffic and make sure the timeout for this group is higher than 2 hours (as Windows will send a keep-alive by default after 2 hours).

3. Latency:

Latency is the time it takes for content to get from a server or service to your device and is measured in milliseconds. Faster is better. It can be caused by a number of factors, like low bandwidth, a sparse connection or transmission time.

Outlook connects to a Client Access Server in Exchange Online, which redirects your request to the server where your mailbox is located. These datacenters are on high-speed backbones, but you have to make sure your connection takes the traffic to that datacenter as fast as possible, with as little latency as possible at your site.

4. Proxy Authentication:

One way to ensure your Office 365 connections complete quickly is to check that proxy authentication is completing quickly. Better still is not to use a proxy at all!

If Proxy Authentication is required, at least make an exception for the Office 365 URL’s and applications:

  • Allow outbound connections to the following destination: *.microsoftonline.com
  • Allow outbound connections to the following destination: *.microsoftonline-p.com
  • Allow outbound connections to the following destination: *.sharepoint.com
  • Allow outbound connections to the following destination: *.outlook.com
  • Allow outbound connections to the following destination: *.lync.com
  • Allow outbound connections to the following destination: osub.microsoft.com
  • Ports 80/443
  • Protocols TCP and HTTPS
  • Rule must apply to all users.

5. DNS Performance:

If name resolution takes time, it results in a poorly performing Office 365 Infrastructure. Make sure your DNS servers are in the same region; for example, if you use “8.8.8.8” as your (external) DNS server, you might get the “wrong” result. Your request is pointed at the US servers first, which causes delays in the connection to the servers in your region.
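You can compare resolution times yourself; a quick sketch using the Windows DNS client cmdlets (results obviously depend on your network and location):

```powershell
# Time a lookup against your configured DNS server...
Measure-Command { Resolve-DnsName outlook.office365.com } |
    Select-Object -ExpandProperty TotalMilliseconds

# ...and against a public resolver in another region for comparison
Measure-Command { Resolve-DnsName outlook.office365.com -Server 8.8.8.8 } |
    Select-Object -ExpandProperty TotalMilliseconds
```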

6. TCP Max Segment Size:

Maximum Segment Size (MSS) is a TCP-level value: the largest segment that can be sent on the link, minus the headers. To obtain this value, take the IP-level Maximum Transmission Unit (MTU) and subtract the IP and TCP header sizes. A standard Ethernet connection uses a packet size of 1500 bytes, leaving us with an MSS of 1460 bytes. Ensure your TCP packets are able to contain the maximum amount of data possible; low values will affect network performance.
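To see the MTU in play on a Windows machine, a small sketch (a ping payload of 1472 bytes plus 28 bytes of ICMP/IP headers adds up to the 1500-byte Ethernet MTU):

```powershell
# List the MTU (NlMtu, in bytes) of each network interface
Get-NetIPInterface | Select-Object InterfaceAlias, AddressFamily, NlMtu

# Probe the path MTU: -f forbids fragmentation, -l sets the payload size
ping -f -l 1472 outlook.office365.com
```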

7. Selective Acknowledgement:

TCP is a reliable protocol which ensures delivery of all data. It does this through ACKs indicating that data has been received up to a certain point in the stream. With Selective Acknowledgement (SACK) enabled, we are able to tell the sender which packet we are missing and which packets we already got. The sender can then retransmit just the missing packet without resending the successive packets. This greatly increases the efficiency of the TCP protocol (it is enabled by default in Windows). Check your network devices to see whether it is enabled or not.

Optimizing network connectivity is highly recommended to ensure you fully utilize the available network bandwidth and get the best possible performance for network traffic. If any of these resources are performing badly, the end customer is likely to experience poor performance.

By addressing the topics described above, in any order, you'll ensure you are providing the best possible Office 365 experience for your users.

Microsoft and Xamarin better together

It is not very fresh news anymore, but we'd still like to dive into some more detail about why Microsoft's acquisition of Xamarin earlier this year is advantageous and exciting for us and our clients.

Who is Xamarin? What do they do?

Xamarin is a young company, about three years old but growing very fast, whose primary business is creating tools that let developers build native desktop and mobile applications for non-Microsoft platforms using modern Microsoft tools and languages such as Visual Studio and C#.

With the increased popularity and adoption of mobile devices and mobile apps, a pure Microsoft developer is obviously in trouble when having to build a native mobile application for iOS, Android or both platforms.

Any of these would require not only knowing the specifics of each system well, but also adopting a different toolset and language for each platform to support. Furthermore, this type of development is at odds with modern methodologies, which help deliver custom apps in a shorter time, more reliably, and with more frequent updates and upgrades.

Being able to manage a single team of developers and use one unique toolset, fully supported and evolving, is a huge advantage in developing a cross-platform mobile application, and for sure Xamarin's strong point.

Why has the acquisition been positive?

Xamarin used to be a commercially licensed product, with yearly subscription licenses to be acquired for developers. Especially for this reason, and despite what you might think if you read about the history behind Xamarin (which in reality started about 15 years ago with the Mono project), it was indeed a pretty closed-source, commercial product.

During this year's MS Build Developers Conference, the partnership has been publicly announced, together with some revision on the licensing model of Xamarin, opening it to wider audience, and publication of most Xamarin's source code on GitHub. This has generated a big increase indeed in the interest and adoption of Xamarin widely.

Finally, the merger can of course only help Xamarin’s tools become better and better integrated with the rest of Microsoft’s development tools.

Are there alternatives? Is it the best?

In cross-platform mobile development, that is, when having to create a mobile application for multiple mobile operating systems, two main ‘worlds’ exist: native apps vs. hybrid apps.

Native apps, as the name suggests, run natively on the platform they are running on, getting the best performance and the ability to make use of all platform-specific capabilities.

Hybrid apps are created with a single language/toolset, namely HTML, JavaScript and CSS, and they run on the different platforms in a web browser that simulates the experience of a native app (so you won’t notice, for instance, the web browser’s address bar).

In many cases it's very hard to notice with type has been chosen when using an App we have downloaded or obtained in other ways, and many apps have been recently changed from hybrid to native (don't know any that has done the opposite path) without the users can notice because interface and experience remained pretty the same.

As mentioned earlier, native apps are in general more performant, even if harder to create with a single toolset, a space in which Xamarin is for sure the leader, alongside a few minor competitors.

At Rapid Circle we think native apps and Xamarin are the best, and probably Microsoft thinks that too :)

Can Rapid Circle help my business create a mobile app with Xamarin?

Absolutely. We've been developing custom mobile apps for our clients using Xamarin since 2014, and since then Rapid Circle is as well an authorized Xamarin Consulting Partner.

The Apps we've been developing normally connects with LoB or Azure environments, having the possibility to fully work in case of lack of intermittent network connectivity, or other such features based on customer's needs; but they'll always run native!

Our preferences are of course Office 365 and Azure, and for this reason I’d suggest you check out some of the apps we’ve published in the app stores:

We'd love to hear your feedback on them.

What more to say… I am myself also a Xamarin Certified Mobile Developer, for the second year now. This certification can be obtained by subscribing to Xamarin University and passing the needed exams. Last (but not least), one of our former colleagues in India, a software engineer, is now working as an engineer at Xamarin, or better said, Microsoft.

We therefore have proven experience if you need to build a custom mobile application, and we can respond to all your questions or needs about Xamarin apps, Xamarin Forms, Xamarin Test Cloud, Application Insights, HockeyApp, and more.

Measuring the heartbeat of Rapid Circle with BI tools

Wouldn’t it be interesting to measure certain ‘online behavior’ that says something about work style, culture, effectiveness and collaboration within an organisation? In order to visualize this, we’ve developed a tool that 1) offers our customers a way to make (the progress of) change management tangible and 2) demonstrates the added value (ROI) of ICT innovations (like an Office 365 platform). We developed this tool, Organisational Pulse, especially for bigger organisations, but we were also very curious to take the temperature of our own organisation! Despite the fact that we, as a relatively small organisation (with 90 employees), already had a pretty clear image of the way we collaborate and do our jobs, it gave us some interesting and useful insights. We’d like to share these insights with you in this blog.

We also summarized these insights in an infographic. Click HERE for the infographic with our measurements and interpretation.

We can measure the behavior of employees with Organisational Pulse in a way that wasn’t possible before. For example: how often do employees copy their manager on an email, how many recurring meetings does someone have and how often do they send emails during those meetings, how many people on average are added to the ‘To’, ‘Cc’ and ‘Bcc’ fields, and so much more. These are just a few examples of data that can tell a lot about the work style and culture in an organisation. If you want to change these, this tool is your guide to keep track of your progress.

Culture

At Rapid Circle we add our direct manager as a recipient to only 3% of the emails we send. You could cautiously conclude that we do not try to justify ourselves much or, as you could also put it, we don’t try to ‘cover our asses’. We do see that one of our ‘departments’ copies the manager on emails much more than the other departments. Furthermore, we see that with an average of 2.4 recipients added per mail, we don’t feel the need to email a lot of people at the same time.

graphs
graphs2

We hardly ever use the Cc or Bcc function. Based on this we could assume that the political pressure in our organisation is very low. We don’t see the added value of copying a lot of colleagues in emails. It doesn’t take much time to add people to an email, but think of the relevance and efficiency if it takes every recipient three minutes to read it. We think this is the most polluting behavior there is in bigger organisations.

On average we spend 2.76 (of the 40) hours per week at Rapid Circle on structural (mostly weekly) meetings. We also spend 1.47 hours per week on average on non-structural, but planned, meetings. In total, we only spend 4.23 hours per week on meetings. These numbers prove Rapid Circle doesn’t have a culture in which meetings hold a prominent place.

graphs3

We expect that for many of our customers these statistics will reveal a lot of interesting, but also confronting information. What will they do with it? To answer this question, we’ll have to wait until this data is available. In any case, it will be very interesting material for organisational specialists to dig into.

Collaboration

There is a lot of cross-silo collaboration within Rapid Circle. We already expected this, because we don’t really have ‘departments’; instead we have competence groups and a flat organisation. For this analysis we sorted all our employees based on competence; note that our (project) teams are mainly multidisciplinary. In this regard, it’s logical that we score 65% on cross-silo collaboration. This means that 65% of all the documents we share, emails we send and meetings we have are shared with colleagues outside of our ‘department’. Our usage of Skype for Business is also very high. Interesting, but also very logical, because worldwide we have three offices (Pune (India), Melbourne and Amsterdam) and our work is not dependent on a location; we work a lot from home, at customers and on the road.

graphs4
graphs5

Effectiveness of our own work

Rapid Circle is a customer oriented organisation. Almost everything we do, like our support, projects, adoption and workshops, we do for our customers. You would expect that more than 50% of our emails are sent to at least one external recipient. It appears to be only 48%. Are we too internally focused? We should investigate that further. On the other hand, if I compare this with the statistics from our customers, 48% is pretty high. Can we conclude that in big organisations in the Netherlands, we just keep each other busy?

graphs6

As said before, we don’t spend a lot of time in recurring meetings, which seems to reveal a certain level of effectiveness. During only 14% of all our meetings was one or more emails sent by one or more people present (or who should be present) in that meeting. This also tells us something about our effectiveness: we seem to be focused on the ongoing meeting instead of other things.

graphs7

Also interesting are the costs of reading emails. What does this really cost us? We measure this based on the number of characters in the text (excluding the quoted text being answered), times a constant that indicates how long someone needs to read a given number of characters, times the number of recipients. After this, we add up all the emails (per month) and multiply the result by €35 per hour. At Rapid Circle, with 90 employees, this adds up to €50,860 per month. Via a different formula we end up with an allocation of approximately 45 minutes per day per person, if everybody reads all of their emails. In our information-intensive organisation, that’s not much. In total, we send 23,000 emails per month.
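For readers who want to play with this calculation themselves, the formula above can be sketched in a few lines of Python. This is a minimal sketch: the reading-speed constant and the example email are illustrative assumptions, not the actual calibration used in Organisational Pulse.

```python
# Hedged sketch of the email reading-cost formula described above.
# READ_SECONDS_PER_CHAR is an assumed constant: ~1,000 characters
# per minute of reading. The €35 hourly rate comes from the blog.

READ_SECONDS_PER_CHAR = 0.06
HOURLY_RATE_EUR = 35.0

def email_reading_cost(new_chars: int, recipients: int) -> float:
    """Cost in euros for all recipients to read one email.

    `new_chars` excludes the quoted text being answered, as the
    formula in the blog prescribes.
    """
    reading_hours = (new_chars * READ_SECONDS_PER_CHAR * recipients) / 3600
    return reading_hours * HOURLY_RATE_EUR

# Example: a 1,200-character email sent to 3 recipients costs €2.10
# of collective reading time. Summing this over all emails in a month
# gives the organisation-wide figure quoted above.
cost = email_reading_cost(1200, 3)
```

Summing `email_reading_cost` over every email sent in a month is what produces the organisation-wide monthly number.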

graphs8

It is our experience that this number is much higher at large organisations. We have seen organisations that receive more than four hours’ worth of emails per day.

Work style

At Rapid Circle, trust and responsibility are important values. We don’t impose working hours and people can work wherever they want. This might lead to the expectation that people work during the weekend as well. However, only 2% of all our emails are sent during the weekend. That’s not much. As an owner of the company you could perceive this in two ways. I choose to interpret it as follows: we do a very good job, because you can see that people can manage their work just fine during the week.

graphs9

What’s also interesting are the numbers that indicate what guides our people in their daily work. Are they guided by their inbox or by their own priorities? On average, people at Rapid Circle are busy sending emails during almost 10 quarter-hour blocks a day; an 8-hour working day contains 32 such blocks. It seems that we do not let our inbox guide our workday. Again, we score lower than other organisations, but it is still 10 quarters on average. If you compare that to the number of emails we receive and how few meetings we have, it is more than I had hoped for.

And again the question is: what do you want to do about it? Delve Analytics, a tool developed to give you more insight into your own work style, might be a solution for us. If we combine the aggregated numbers with the individual statistics from Delve, employees can become more aware of an effective work style. This only works if we explain it right and if the management team sets the right example. At Rapid Circle, management could do better in that respect, because on a daily basis we check our inbox during 26 of the 32 quarters per day. That means that from just about the moment we get up in the morning till the moment we go to bed, we send an email almost every 15 minutes. I find that shocking. Or is it okay for us? I’d like to hear other people’s opinions.

Moreover, the same statistics indicate that the support team is led by their inbox only 8 quarters per day. Apparently they use other (the right?) tools for messaging within our support system.

Furthermore, we can measure a lot of smaller things. Things that don’t necessarily say something about work style in general, but about certain behaviors that can be influenced. For example, only 17% of the information we share goes via the cloud (from OneDrive or SharePoint, or with a link to the document). So, we still send a lot of attachments via email. We should be ashamed of ourselves as a Microsoft Cloud partner. Okay, we already do a lot better than other organisations, but this number must increase. How? At least by management setting the right example, because, shame on us, that group scores a sad 8%. The technical team already does much better, with 45% of attachments sent through the cloud.

Overview IG Organisational Pulse English

Other things we can measure

The statistics displayed here are only a small part of all the statistics we can generate. These other statistics can offer even more insight into culture, usage, costs and added value. For example: the value of knowledge or of documents, which departments contribute the most to the (re)use of knowledge, or whether meetings start and end late or on time.

Every organisation will interpret and use the statistics in a different way. We also see that what one organisation finds interesting might not interest another, or might even serve as an indicator of change progress for another. The dashboard will be different for everybody. Luckily our application works like a Big Data Analytics tool: we can create new charts very quickly via Power BI.

Conclusion

The conclusion I draw from our own statistics is that, on the one hand, Organisational Pulse has the most value for bigger organisations in particular. On the other hand, these numbers provided a couple of interesting insights. For example, the statistics about the usage of cloud services: we use all the functionalities, but not always to the max. What does that tell us about adoption roadmaps at our customers? Changing behavior is really hard. We’re also going to act on setting an example as the leadership team, including myself.

The purpose of applying this tool to our own organisation was mainly to give us a better feeling about how to interpret the data. We can also state that creating a benchmark with more data from customers will be very interesting. This way we can measure how organisations do, compared to the average of a few other (big) organisations.

Would you like to know more? Read more, download the infographic or contact Wilco Turnhout.

Crawling as fast as possible

More and more we are becoming a generation of searchers. And that is not meant in a philosophical, running your fingers through your beard kind of way, but in a very practical everyday reality kind of way. When was the last time an argument you had among friends was not settled by searching the internet for the right answer? And this trend is not just influencing your personal life, but your work life as well. A clear shift is happening within corporations to go to a flatter organization structure, have self-organizing teams and increased cross functional interaction. We are trading in hierarchies for communities. Heck, there are even companies that let their employees pick their own job titles[1].

In this world of less structure one thing becomes more and more important to still be able to do your work: search! However, making sure stuff is available to be found within SharePoint Online is not always straightforward.

Our colleagues Marijke Ursem and Martin Offringa wrote a blog (read it here) about the workings of search in SharePoint and how to make sure the search results are shown just how you like them. So we will not cover any of that here. Instead we will dive into the bag of tricks we have to make sure that content is searchable as quickly as possible.

The Index

For those of you who are new to the subject of search in SharePoint, let us quickly cover some of the basics.

The search results you see in the content search web part or the search results web part do not come directly from your lists and libraries, but from the search index. The index can be considered one big bucket with all the searchable content, and only stuff that is in the index can be found through search.

Based on an automated schedule the index is filled with the latest changes that occurred in your tenant. This is done by the crawl, and in SharePoint Online there are two variants of the crawl: 1) the continuous crawl that runs every 15 minutes and picks up new and changed documents or items and 2) the incremental crawl that runs every 4 hours and picks up changes in the search configuration.
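Put differently, the worst-case wait before a change becomes eligible for the index depends on what kind of change it is. The interval arithmetic can be sketched as follows; this is an illustrative model, not an official API, and the 15-minute and 4-hour cadences are the ones mentioned above.

```python
# Illustrative sketch of the two SharePoint Online crawl cadences
# described above. The intervals come from the text; everything
# else is simple worst-case interval arithmetic.

CONTINUOUS_CRAWL_MINUTES = 15        # picks up new and changed documents/items
INCREMENTAL_CRAWL_MINUTES = 4 * 60   # picks up search configuration changes

def worst_case_wait_minutes(change_type: str) -> int:
    """Longest wait before a change is picked up by its crawl,
    assuming the change lands right after a crawl has just run."""
    if change_type == "content":
        return CONTINUOUS_CRAWL_MINUTES
    if change_type == "configuration":
        return INCREMENTAL_CRAWL_MINUTES
    raise ValueError(f"unknown change type: {change_type}")

# A document edit waits at most 15 minutes; a search configuration
# change can take up to 4 hours (240 minutes) to be picked up.
```

This asymmetry is exactly why the tricks later in this post focus on configuration changes: a plain content edit is indexed quickly on its own.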

Crawling 1

Schematic to show how the content of a tenant, the crawl and the index relate to each other.

Lack of Control

One of the most common complaints about search in SharePoint Online is that even with the highest permission levels on your tenant, you are still not fully in control of the crawl. This is because, in contrast with an on-premises situation, the automated schedule cannot be changed. In SharePoint Online, Microsoft runs the show.

But there is no use in complaining, because at the moment there is no option to speed up the crawls. So if you can’t beat them, join them: there are some tricks that help you go as fast as possible when it comes to getting your changes crawled in SharePoint Online.

The Basics

First, a document which has no published version will not be crawled. So when you are working inside a document library that has minor (draft) and major (published) versioning activated, make sure to publish your documents.

Second, when you add a column to a list or library, it will not be crawled if no item has a value for that column. So make sure that at least one item contains a value for this new column, even if that means adding a temporary test item.

The Simple Tricks

Maybe the best analogy for the scheduled crawls is to view them as an old-fashioned postman doing his rounds on a fixed schedule. On his round he comes by a series of classic postboxes with little red flags on them. The classic postbox works by raising the flag when there is something in it and leaving it down when it is empty. And let’s decide that in this analogy raising the flag is a signal to the postman doing his rounds to empty the postbox.

Furthermore, it is important to know that the crawl acts on value changes. So in our analogy, value changes raise the flag automatically and indicate to our postman to pick up the changes. If you have a document and you change, for example, the person mentioned in the “owner” field, then this change will automatically be picked up by our postman. However, when you change the way the owner needs to be presented from “account name” to “display name with presence”, this change will not automatically raise the flag, since no value change occurred. Only a setting change was made.

To make sure your change is picked up anyhow, you can raise the flag yourself via the library settings, which is described in a Microsoft support article[2].

Crawling 2

The same article also describes how to raise the flag for a whole site and since Microsoft already did an excellent job of explaining how it is done, we have nothing to add to their story.

Crawling 3

When we leave the libraries and lists behind us and start getting our hands dirty within the search center of SharePoint Online, there are also some tricks we can pull out of our top hat. Within the Search Schema we can have a ball setting up managed properties and mapping all sorts of crawled properties to our managed properties.

For those of you who are new to the Search Schema, crawled properties and managed properties and want to learn more about the topic, we recommend giving the support article Manage the search schema in SharePoint Online a good read.

While you can do a lot of nice and necessary work inside the Search Schema, you will have to do something extra to make sure your changes take effect. The reasoning behind this is that “…Because your changes are made in the search schema, and not to the actual site, the crawler will not automatically re-index the site”[3]. What you need to do is re-index the site which uses the crawled property that you have used in your managed property mapping, and then “…site content will be re-crawled and re-indexed so that you can start using the managed properties in queries, query rules and display templates”. Or, if the crawled property is only attached to a certain library or list, you can re-index that list, which will have the effect that “…all of the content in that library or list is marked as changed, and the content is picked up during the next scheduled crawl and re-indexed”.

So for sites, lists and libraries we have the power to raise the flag and our postman (a.k.a. crawl) will pick up our changes and update them in the index so they are seen in search results.

 

The Advanced Tricks

At this point your question will undoubtedly be what else you can do to give the crawl a kick, because going into every site that you want to raise the flag for one by one is just too much of a hassle.

Well unfortunately, this raising-the-flag trick is the only instrument we have in the web interface. Because as said, there is no way to influence the schedule, only ways to influence what is picked up during the next round of our postman. But rest assured, we are not suggesting that you actually go into every site that you have and click a button. We are suggesting that you put others to work for you.

The first option you have is to put Microsoft to work for you. Via the Admin Center you can raise a ticket to Microsoft technical support and ask them to re-index a bunch of sites, a site collection or even all your site collections. It is also possible to request a re-index of all the user profiles. What Microsoft Technical Support will then do is raise the flag for all your content so that everything gets picked up during the next round of the postman. The upside is that Microsoft can do this much more efficiently, but the downside is that you still have to wait for the next incremental crawl. And of course, there is waiting involved between raising the ticket and getting a response from Microsoft.

So, where do we have to turn to get even faster results? This is really not a question of who, but a question of what. Because the answer lies in PowerShell. For those of you who want to learn more about Windows PowerShell, this TechNet article is a nice place to start.

With PowerShell we can fire off commands to our SharePoint tenant and, just to name an example, raise the flag on a bunch of sites. So this puts you back in control and relieves you from waiting on Tech Support to pick up your ticket. Plus, you won’t have to do much scripting, because others have already done it for you. Two scripts that are particularly handy come from Mikael Svenson (https://twitter.com/mikaelsvenson).

The first script enables an admin to trigger a re-index of a site collection and all its subsites[4]. The way the script raises the flag is by changing the search version property of the site or site collection, which ensures that the site will be picked up for re-indexing on the next incremental crawl. This is a major time saver in the sense that you do not have to manually trigger re-indexing on every single site.

The second script allows you to raise the flag for all the user profiles in your tenant[5]. A user profile is just another content record, and for it to be picked up by the crawl it needs a value change. So when you start changing user profile properties, it would require a user to change something about their profile before the change is picked up. And since users do not necessarily change their profiles very often, it might take a while before your change has reached all users. This script is a major help in activating your change for all profiles in your tenant, also because there is no way to raise the flag manually on a profile other than applying a value change to that profile. And that is exactly what Mikael’s script does: on every profile it overwrites a property value with the same value, which in the eyes of SharePoint is a value change, and thus all the profiles are picked up by the next incremental crawl.

 

Summary

When working with search in SharePoint Online you have to deal with the fact that you cannot influence the crawl schedule. Just put it out of your mind and try to accept it. What you can do is make sure that all the changes you made are picked up as soon as possible by the continuous and incremental crawls that pass by your tenant. Or, to put it in terms of our analogy, make sure that the postman picks up your message on his very next round.

 

Disclaimer

A lot of the items discussed in this blog were first created, communicated or distributed by others. We certainly want to give credit where credit is due, so we did our absolute best to always show the source or inventor of a trick where possible.

Also, the scripts mentioned in this blog should only be used and deployed by people who understand what they are doing. Never let code loose on your tenant that you do not understand yourself. This warning has nothing to do with PowerShell or these scripts in particular, but is just part of good sensible ownership for any admin.

If needed Rapid Circle can help you understand and safely deploy these scripts on your tenant and help you save time in configuring search for your SharePoint Online environment.

[1] http://www.fastcodesign.com/3034987/evidence/the-case-for-letting-employees-choose-their-own-job-titles

[2] https://support.office.com/en-us/article/Manually-request-crawling-and-re-indexing-of-a-site-a-library-or-a-list-9afa977d-39de-4321-b4ca-8c7c7e6d264e?ui=en-US&rs=en-US&ad=US

[3] https://support.office.com/en-us/article/Manually-request-crawling-and-re-indexing-of-a-site-a-library-or-a-list-9afa977d-39de-4321-b4ca-8c7c7e6d264e?ui=en-US&rs=en-US&ad=US

[4] http://www.techmikael.com/2014/02/how-to-trigger-full-re-index-in.html

[5] http://www.techmikael.com/2014/12/how-to-trigger-re-indexing-of-user.html

Microsoft Forms preview: The ins & outs

Microsoft Forms was formally introduced via an Office Blog post, "Microsoft Forms—a new formative assessment and survey tool in Office 365 Education", and has been in preview since April 2016 for Office 365 Education subscribers. It allows users to create quizzes, questionnaires, assessments and subscription forms.

The product

Microsoft Forms allows users to create web-based forms in which different types of questions can be defined. The tool lends itself to creating a pop quiz for a classroom, a questionnaire to gather qualitative information about a topic or a simple subscription form. It specifically targets the education market, and therefore only Office 365 Education licensed users are able to use the product (in preview). Watch the video released by the product group below:

Microsoft aims to deliver an easy and fast solution for teachers to create assessments, which can be filled out via all types of browsers on all types of devices. Don't let the simple interface fool you; you do have powerful options available, such as validation, notifications and export to Excel.

Not the new InfoPath

If you are anything like me, your first reaction when hearing that there is something new called Microsoft Forms will most likely be: “Finally, the replacement for InfoPath has arrived!”. Well, it has not. Microsoft Forms does not come close to the full suite of options we know from InfoPath. And more importantly, there are no signs whatsoever from Microsoft that it is supposed to replace InfoPath in the future. For that we have to look at Microsoft PowerApps. Microsoft Forms is a product to create assessments, quizzes, surveys, etc. Let's show the power of the product by building a little quiz.

Pop quiz!

Let's jump into the product and create a pop quiz! It'll show off what the product is really good at: creating a questionnaire.

The Basics

Microsoft Forms logo

So let’s find out what it can do and take a closer look at this new addition to the Office 365 family. First off, Microsoft Forms is a legit product within the Office 365 suite for education licensed users, and therefore you start it, like any other product in the suite, from your app launcher.

Launching Microsoft Forms brings you to the My Forms overview, where all your Forms are shown, and a button to start creating a new one. My overview looks like this:

Forms

When you click the new button, a Form builder is loaded which allows you to enter a title and an introduction text and start adding questions. There are five types of questions that can be added to a Microsoft Form.

Forms

First, we have the “choice” type, which allows you to define a question and list a set of options that can be the answer. Unique to the “choice” type is the “other” option, which you can use if you want to provide a way for your form users to answer outside of the given options. This type of question is typically used for general information questions where there is no wrong answer, like: “How did you find out about this quiz?”.

Forms

Second, there is the “quiz” type, which also works with defined answer options. Unique to this type is that there is an actual correct answer which can be set. You can also provide feedback for each option to explain why an answer is correct or incorrect. The “quiz” type question is really the one that gives Microsoft Forms its educational flavor, because it is used to verify knowledge instead of gathering information.

Forms

Third, the “text” type, for which the answer is given in a text box. Unique to this type is the option to allow a long answer, which gives the person taking the quiz a bigger text box for the answer.

Forms

Fourth, we have the “rating” type, which allows you to answer using a scale. This scale can be set to stars or numbers and can run from 1 to 5 or from 1 to 10. The “rating” question is often used in questionnaires to gather information about how much the respondent agrees or disagrees with certain statements.

Forms

Fifth and last, there is the “date” type question, for which the answer is given by selecting a date from the calendar. A date type answer is often seen in subscription or application forms, or in questionnaires to ask about someone’s birthday, for example. However, with a little creativity you can work this type of question into a quiz if the answer is a date (perfect for history exams), with a question like: “What was the founding date of Rapid Circle?”.

Forms

Advanced options

For each of the five types of questions you can indicate whether a question is required or optional. This is common practice in almost any questionnaire tool, but since it is such a powerful way to ensure data completeness, I did not want to let it go unmentioned.

Also, for all types of questions you have the option to add a subtitle. This can be used to provide a hint about the answer or give guidance on how to answer the question.

“Choice” and “quiz” type questions can be turned from single answer questions into multiple answer questions with the flick of a switch. However, the way Microsoft Forms lets the user know that multiple answers are possible is very subtle: for single answer questions the option selection boxes are round, and for multiple answer questions they are square. So when making a multiple answers question, I would definitely recommend putting something like “(multiple answers possible)” into the question. Otherwise you will surely get complaints from your quiz takers.

Also for “choice” and “quiz” type questions, you can select the setting to shuffle the options. This presents the answers in a different order every time the quiz is loaded, which has several advantages. One, and I know you are thinking the same, it makes it harder to cheat. Two, looking beyond the possible bad behavior of quiz takers, there has been a lot of research on how the order in which options are presented influences which option is most likely to be chosen by quiz takers. So if you as a quiz creator want to remove this bias, shuffling the answers is a nice option that helps you.

For the “text” type question, it is possible to set restrictions, for example that the answer should be a number (nice for math problems) or that the answer should be between two values. All the restriction options are number based, so they effectively let you turn the “text” type question into a sixth type of question, namely a “number” type.

For the form as a whole there are also some additional settings that can be turned on or off. For example, you can choose whether to apply a deadline or to shuffle the questions.

Forms

Sending out the Quiz

When you are done creating the quiz, there are several ways to spread the word about it. Obviously you can share the link by copying and pasting it to a certain location, or email the link.

But next to that, Microsoft shows a nice realization of their mobile-first strategy by allowing you to create a QR code for your quiz so people can scan it with their smartphone. Of course we did a test among colleagues, and it worked like a charm. This function is especially interesting when promoting a training or event for which users need to subscribe. On the poster or flyer you can easily include the QR code so people walking by can scan it and immediately subscribe.

The last way to offer your quiz to users is by embedding it onto a webpage. This could be a SharePoint page, but any other webpage will do as well.

Forms
Forms QR

When spreading the word about your quiz, questionnaire or subscription form, you can still control who can fill it out. While the options are not very extensive (to say the least), the most important choice is available, which is whether to allow people outside your organization to fill out the form.

Forms

Feedback to the User

When someone fills out your Microsoft Form, they get a piece of feedback after submitting. Next to the standard messages that thank the user for submitting and verify that the form was submitted successfully, extra feedback is given when “quiz” type questions are incorporated in your form.

First, as discussed, a “quiz” type question offers the option to provide a comment per answer option, which is shown after submitting the form. Second, in the advanced settings you can determine whether the user should see the correct answer for a “quiz” type question after submitting. And third, a user score is calculated based on the number of “quiz” type questions they have answered correctly.

This last one is a bit tricky, because it only looks at the “quiz” type questions in the form. So if you have a form with 8 questions and 4 of them are “quiz” type questions, then the maximum score a person can get based on the feedback is 4 out of 4. From a technology point of view this makes sense, because for the “choice”, “text”, “rating” and “date” type questions you cannot indicate what the correct answer is, so those questions are simply ignored. But from a user experience point of view it is pretty weird if you just answered 8 questions and see that your score is 3 out of 4. And since there is no option to switch off this feedback about the user’s score, it definitely takes some communication effort to avoid confusion or complaints. So I would advise you to add a note covering this in the description text at the top of the form.
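To make the scoring quirk concrete, here is a minimal sketch in Python of how a score of 3 out of 4 can appear on an 8-question form. The question and answer shapes are hypothetical, chosen purely for illustration; this is not the Microsoft Forms data model.

```python
# Hedged sketch of the scoring behavior described above: only
# "quiz" type questions count toward the score, all other types
# are ignored. The dict shape is a made-up stand-in.

def quiz_score(questions):
    """Return (correct, maximum), counting only quiz-type questions."""
    quiz_qs = [q for q in questions if q["type"] == "quiz"]
    correct = sum(1 for q in quiz_qs if q["answer"] == q["correct"])
    return correct, len(quiz_qs)

# An 8-question form: 4 quiz-type (one answered wrong) and 4 of
# other types, which have no correct answer to check against.
form = [
    {"type": "quiz", "answer": "a", "correct": "a"},
    {"type": "quiz", "answer": "b", "correct": "b"},
    {"type": "quiz", "answer": "c", "correct": "c"},
    {"type": "quiz", "answer": "d", "correct": "a"},  # answered incorrectly
    {"type": "choice", "answer": "x", "correct": None},
    {"type": "text", "answer": "hello", "correct": None},
    {"type": "rating", "answer": 4, "correct": None},
    {"type": "date", "answer": "2016-05-01", "correct": None},
]

# The respondent answered all 8 questions, yet the reported score
# is 3 out of 4 - exactly the confusing experience described above.
score = quiz_score(form)
```

Seen this way, the "3 out of 4" message is simply the quiz-only tally, which is why a note in the form description is worth the effort.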

The responses

If you did a good job building and sharing your Microsoft Form, you will have plenty of responses in no time, which are automatically analyzed for you in the responses section of your form. Here you will find some statistics about the form as a whole and more detailed statistics about each individual question.

Forms

I have to say that the automatically generated statistics are quite good and cover the basic requirements for insight into your responses. But before we go into detail, I would like to point you to the “Open in Excel” button at the top right, which allows you to go completely berserk analyzing the responses in your own way.

Forms excel

For “choice” and “quiz” type questions, the responses are presented in a table as well as a chart. For “text” and “date” type questions, the number of responses is presented along with the last three responses. And for “rating” type questions, the number of responses is shown together with the average rating.

And for each question you have the option to click the “Details” button, which shows all the responses for that particular question in a dialog box.

Forms

Final thoughts on Microsoft Forms preview

Microsoft Forms is a very complete quiz tool that helps you create quizzes, questionnaires and simple subscription forms in a quick and easy way. Especially for a product which is still in preview, I have to say that this first version already covers a lot of requirements. However, there are two major points of critique.

First, the name. It is very misleading in the sense that it raises high expectations for anyone who knows that InfoPath will be leaving us in the future. Because if you review Microsoft Forms Preview from the perspective of it replacing InfoPath, you will be very disappointed.

Second, the audience. Microsoft offers the preview exclusively to Office 365 education licensed users, while this product can also be very helpful outside the educational realm. Many corporations, government bodies and non-profit organizations could use this product. Creating a quiz for your internal training programs, making a questionnaire for customer satisfaction research or building subscription forms for an event is daily business for any type of organization, and therefore restricting this product to the educational market seems like a strange strategy. It even feels unfair to non-education licensed users. Logically, there are many, many people lobbying to bring Microsoft Forms to all Office 365 users when it becomes Generally Available, and I am one of them.

So Microsoft Forms is shaping up to be a promising tool for creating quizzes, questionnaires and subscription forms. It covers the basics and will do just fine in 90 percent of cases. But it is not the long-awaited replacement of InfoPath, so that remains on the wish list, and it will live a life in the shadows of the Office 365 suite if it remains solely targeted at education licensed users.

FAQ

How can I get Office Forms Preview?

Sign up to gain access to the preview via https://forms.office.com. Unfortunately it is only available right now for Office 365 Education and the US market. If you are outside the US but do have access to an Office 365 Education tenant, sign up and fill out a US address.

Will it only be available for Education tenants?

At the moment it is only available for Education tenants. Microsoft is exploring all possibilities, but has nothing to share about that as of yet.

Will it be available in my region/language?

Yes, Microsoft Forms will be launched for all Office 365 Education regions and languages.

Is this the Infopath replacement?

You might think that when you read the product name, but... no, this isn't even close. Look at Microsoft PowerApps as the InfoPath replacement.

Is this the final product?

It's in preview with no live date set, so you may expect changes, both small and large. If you'd like, you can contribute via the feedback button when you're using Office Forms, or post your ideas and upvote others on the Office 365 UserVoice (https://office365.uservoice.com/).

Is there a Microsoft Support article available?

Yes, use your favorite search engine or follow the link: Microsoft Support - What is Microsoft Forms?

The preview is available for the US right now. Anything I should be aware of when outside the US but still applying?

Yes, as it's running for the US only right now, all data is stored in Microsoft Data Centers in the US. So if you're in Europe, for instance, the data entered in Microsoft Forms preview will be stored on US servers. This will remain the case until the product becomes available for your region.

What will the future of development in SharePoint look like? Pretty much like it does now!

Managing the lifecycle of the components you build, deploy, operate and support on Office 365 is difficult. The platform is new, the cloud is new, and much of your classic DTAP model is simply no longer appropriate. We at Rapid Circle have been committed, for some time now, to being a Microsoft Cloud company rather than just a SharePoint company, and the recent developments around SharePoint announced at the Future of SharePoint event on May 4th continue to vindicate our stance.

Specifically, the announcement of an upcoming SharePoint Framework (SPFx) makes it clear that client-side development and the use of JavaScript are encouraged and promoted for custom solutions on Office 365!

Some key aspects relevant for developers and administrators that we found very interesting were:

  • The framework is a fully JavaScript-based model.
  • There is no single JavaScript framework mandated. We can still use popular frameworks such as Angular, Knockout and Handlebars.
  • Node.js and Gulp tasks are used for packaging and deploying components.
  • The local development model is going to be very different. A SharePoint Workbench is introduced where Gulp and Node.js will be used to host files locally, so you don't need to use IIS on your local machine.
  • Visual Studio Code, an open source code editor, is being promoted as the preferred tool. This also clearly indicates that Visual Studio is not a must-have requirement for developers.

It's important to note that SPFx is not a radically new model. The framework might be new, but we've been doing client-side development for some years now using similar techniques. Finally, though, Microsoft is creating a framework which leverages techniques we already use, such as CSOM and the REST APIs. Taking it a step further, Microsoft is openly embracing open source technologies such as Node.js, Gulp, Yeoman and more.

Our Rapid ALM tooling is very much aligned with what Microsoft described at the event. It's a development and delivery model that's entirely JavaScript based and enables us to streamline everything from the core development of our Instant Intranet components, integrated with our test teams, to deploying to our client tenants. Furthermore, it's built on and resides in the SharePoint Online platform.

Using Rapid ALM to evolve our Instant Intranet solution, we are also “early adopters” of Visual Studio Code as the default editor for our apps development, where we use Angular 1.4 and RequireJS. Moreover, we utilize NPM and Gulp for building, packaging and deploying our Instant Intranet apps. Finally, we use Git for source code control and versioning.

Future of SharePoint Development model

With this development model, we have successfully created and set up intranets for customers in the first quarter of 2016. This new development model in SharePoint gives us the platform for Rapid and Instant service to our customers.

And we're looking forward to the upcoming SharePoint Framework and will continue to evolve and enhance our own software development lifecycle!

This blog post is part of the series Future of SharePoint. More on this topic can be found at http://08b.4d7.myftpupload.com/tag/FutureOfSharePoint/

A Preview of Microsoft Flow: the Pro's and Con's

1. What’s Flow?

Microsoft Flow is a new product, announced during the keynote of the Future of SharePoint event, that allows you to create cross-application action-reaction scenarios. Flow allows you to define what should happen if a certain event occurs, also known as the “If This, Then That” (IFTTT) [1] model. It is currently in preview, and at https://flow.microsoft.com you can sign up and get started on creating your own flows.

The whole premise is that you pinpoint the events that are important to you and decide what should happen when such an event occurs. To give just one example: imagine that you are a project manager for a top priority project and you want to keep a close ear to what people say about this project. You could create a Flow that keeps track of Yammer and creates an entry every time someone mentions your project there. With Flow this is possible.
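The trigger-action pattern behind this scenario can be sketched in a few lines of JavaScript. This is purely a hypothetical illustration of the IFTTT model, not Flow's actual engine; the project name and event shape are made up:

```javascript
// Hypothetical sketch of the "If This, Then That" model behind Flow.
// A flow pairs a trigger (a predicate over incoming events) with an action.
function createFlow(trigger, action) {
  return function handleEvent(event) {
    if (trigger(event)) {
      return action(event); // trigger matched: run the reaction
    }
    return null; // trigger did not match: nothing happens
  };
}

// Example: log every Yammer message that mentions "Project Phoenix".
const projectLog = [];
const projectWatcher = createFlow(
  (event) => event.source === "yammer" && event.text.includes("Project Phoenix"),
  (event) => {
    projectLog.push({ author: event.author, text: event.text });
    return "logged";
  }
);
```

Feeding `projectWatcher` a matching Yammer event appends an entry to `projectLog`, while events from other sources fall through untouched.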

2. So the new Workflows basically?

Officially, Microsoft has not yet stated whether or not Flow is going to replace SharePoint Designer workflows. But as we all know, SharePoint Designer 2013 (because SharePoint Designer 2016 was an exact copy of SharePoint Designer 2013) was the last release of SharePoint Designer we can expect from Microsoft, and with it we will say goodbye to workflows as well. This makes Flow the prime candidate to replace SharePoint Designer workflows. What I did find in the communication is that the product team will focus on “…adding ways to leverage your existing SharePoint workflows in Microsoft Flow.”[2]

I personally am a big fan of SharePoint Designer workflows, so when I got my hands on the Microsoft Flow Preview I could not help comparing it with the possibilities that SharePoint Designer workflows offer me.

3. What are the Pro’s?

Flow should not simply be seen as a newer version of workflows, because it is a whole other platform and approach to supporting “if this, then that” (IFTTT) [3] logic. So what makes Flow great?

Cross site workflows

SharePoint Designer workflows only operate within the borders of a site, so all lists and libraries that are involved in your logic process need to be in one site. With Flow, it does not matter where your items are stored in SharePoint. It can trigger the creation of an announcement on the Management Team site because a deal was completed in the prospects list of the Sales department site. It does not even matter whether these sites reside in the same site collection or the same tenant.

Flow1

Cross application workflows

The cross-platform approach of Flow goes way beyond the borders of SharePoint by allowing you to tap into all sorts of different applications. If we go back to our previous example of the project manager keeping his ears open to the conversations on Yammer, we could expand that scenario by including other social platforms like Twitter and Facebook. So whenever the project is mentioned on Yammer, Twitter or Facebook, we record an entry in a SharePoint list. Or maybe we do not want to use a SharePoint list, but log everything in an Excel file so that we can crunch the numbers later and do some slicing and dicing analysis on the mood around this very important project.

With Flow, all the above-mentioned applications can be linked to each other and used in your IFTTT scenarios. Below is an overview of the applications that are currently included in my preview version of Microsoft Flow.

Flow2

Recurrence

Everybody who has experience with SharePoint Designer workflows will have had a firsthand taste of how difficult it is to add recurrence to a workflow. Starting a workflow on a certain date or time required making use of retention policies, and making a workflow recur every hour, week or month meant pulling out a whole bag of tricks. Not anymore with Flow: you get recurrence out of the box, which allows you to run your Flow every day, hour, minute or even second!

Thinking of our example case, we could add a recurring step that sends out the mood analysis about the project every week.

Flow3

User Profile Lookup

Flow allows you to look up a User Profile as one of the actions. This can be your own profile, a specific user's profile or the profile of a user based on a search. Even getting the profile of someone's manager is no problem, which is a real help in approval flows.

With looking up the profile, you also get access to the fields that are in the profile. And since you can add fields to the Office 365 profile of your users which make sense for your organization, this means that you can use those organization-specific profile fields in your Flow. Thinking back to our example, we could look up certain details about the people that mention the project, for example the department they work for or the country they are based in. This could increase our insight into which parts of the organization have the project on the agenda.

Flow4

Templates

Where SharePoint Designer workflows came with a set of predefined templates that you could use straight out of the box, Flow goes the distance when you consider the number of templates on offer[4]. Also, the set of templates is growing every day, because you can choose to share your self-created flows with the community.

4. What are the Con’s?

As said, Flow is in preview state and thus still being worked on. Therefore, it is to be expected that some bits and pieces are missing. So what are the things that are not so great about Flow?

Everything is personal

One of the first things I noticed is that, because Flow spans its logic across many applications, I was entering a lot of usernames and passwords to prove that I had the correct credentials for all these applications. Whenever you enter credentials, a connection is added to your Flow account in order to allow you to run the steps in your Flow that access that particular application. You can manage these connections in your personal connections overview.

Flow6

This however begs the very important question: is every Flow an impersonation workflow? And while I could not find a definite answer in the documentation [5], I cannot see any other conclusion than that a Flow you create only works based on the connections you have defined. This means that when you build a Flow that starts when an item is modified in a certain list, anyone who has permission to modify items in that list can trigger the Flow to start, and it will use your connections to go through all the defined steps. So every item that is modified, every item that is created and every email that is sent uses your credentials.

This is a very big deal and requires serious thought before activating a Flow and letting it run on a library or list.

Start other flows

It is not possible to start a second Flow as an action of your primary Flow. While this option brought much happiness to the users of SharePoint Designer 2013 workflows, it is not included in the set of actions in Flow. This is a pain, because it again leads to the situation where every single variation and exception within your logic process has to be included in the same Flow. This makes building and maintaining Flows unnecessarily complex.

Reorder steps

I almost could not believe it, but you cannot rearrange the steps when building a flow. When you start building you have a first step and a plus sign beneath it. But after adding three or four steps I could not squeeze another one in between the ones I already had. This means that each time you want to go back and add a step in between others, you would have to delete everything first, add the new step, and then recreate the deleted steps. I dare say that this is not just user-unfriendly, it is ridiculous.

And/Or conditions

In Flow you still rely on Conditions and Actions to create your Flow. But the options you have for formulating your conditions are greatly reduced compared to what you are used to in SharePoint Designer, where putting multiple conditions together immediately results in AND or OR logic. In Flow this is not that easy. If you put two conditions beneath each other you have to define actions in between, so there is no possibility to define an OR scenario. And AND scenarios are only possible by putting a whole series of conditions beneath each other and leaving out an action for the “If yes” path.
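The workaround of chaining conditions to get AND behavior amounts to the following logic. This is a hypothetical sketch of the pattern, not Flow's internal representation:

```javascript
// Hypothetical sketch: emulating AND by nesting conditions, the way the
// Flow designer forces you to. Each condition only continues to the next
// check on its "yes" branch; the action runs only when every condition passes.
function runNestedConditions(conditions, action, input) {
  for (const condition of conditions) {
    if (!condition(input)) {
      return "stopped"; // a single "no" branch ends the flow early
    }
  }
  return action(input); // all "yes" branches reached: effectively AND
}

// Illustrative conditions and action (names are made up):
const isHighPriority = (item) => item.priority === "high";
const isFromSales = (item) => item.department === "Sales";
const notify = (item) => `notify manager about ${item.title}`;
```

Each condition in the chain is a separate designer block; only an item passing every check reaches the final action, which is exactly the cumbersome AND construction described above.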

Flow7

For conditions you do have the option to move into advanced mode within Flow, but that requires you to learn a new syntax to create your AND or OR condition. Oh, and once you choose to go into advanced mode, there is no turning back to the “not advanced” mode. Very user-unfriendly, to say the least.

Flow8

The verdict

Microsoft Flow really is one of the game-changing components of the Future of SharePoint announcement. While it is very tempting to compare it to SharePoint Designer workflows, it actually requires a whole different approach because of the cross-platform possibilities, deep integration with Office products and personalization.

The product on the one hand shows great potential but on the other hand still shows clear signs that it is merely in Preview. If I had to make up the score right now, I would already conclude that the good outweighs the bad, so I am very eager to see what will be added to Flow in the future.

I will definitely keep a close eye on the roadmap to see what is coming and advise everybody to do the same, because for me, Flow is absolutely part of the Future!

This blog post is part of the series Future of SharePoint. More on this topic can be found at http://08b.4d7.myftpupload.com/tag/FutureOfSharePoint/

Sources:

  1. IFTTT
  2. Flow Microsoft
  3. IFTTT
  4. Flow Microsoft Templates
  5. Flow Microsoft Documentation

Watch the keynote "The Future of SharePoint" here.

Lots of changes and innovations were announced at the 'Future of SharePoint' conference last month. We've watched the keynote and found it very interesting. We think you might find it interesting as well, so below you will find the full keynote!

Full keynote presentation "Future of SharePoint"

Want to know more?

Read more in-depth information in other blogs/articles by myself and my colleagues: List of #FutureOfSharePoint posts

This blog post is part of the series Future of SharePoint. More on this topic can be found at http://08b.4d7.myftpupload.com/tag/FutureOfSharePoint/

Forms and SharePoint: Excellent question! 

The new kid in town

There is a brand new feature in Excel which allows you to make good-looking surveys super fast. This new addition to your favorite workbook tool is placed front and center in your OneDrive (in my case OneDrive for Business), and it will appear in your SharePoint document library as soon as you switch to the new look and feel (read more about the new experience in document libraries here). When you select New, Excel Survey pops up in the drop-down, and as soon as you create one you quickly enter a name for your document and then you are building your survey straight away.

How does it work?

When you create an Excel survey you basically create a workbook plus a nice interface for data entry. This was always possible for the Excel experts among us, but now it is readily available out of the box to all, and way faster than building it yourself in an ordinary workbook. The survey builder lets you pick a title and a description (or delete the placeholder if your survey doesn't need them) and then you start defining the questions.

Forms and SharePoint: Excel Survey Blog

Every question has an additional settings menu where you can define the question, the subtitle, the response type, whether it is required or not, and a default answer. And of course you will find the add and delete buttons in this panel, as you are used to from Microsoft. There is no validation (apart from making questions required) and no special formatting or anything; just plain and simple, and above all super fast, survey building. And when I say fast I mean fast. The survey that I show below contains a title, a description and five questions, and was built in under ten minutes. And that was timed including the time to take screenshots of every step.

Once you are done building your survey you simply hit the “Save and View” button to get a glance of how the survey will look to the people you will be asking to take it. And if you like what you see, you simply hit the “Share Survey” button and a link is generated that you can send to your audience.

What has happened in the background is that for each question you created, a column was made in your workbook. This is where the responses of the people taking your survey will be stored. And the columns have properties that match the kind of questions they are linked to: a text column is made to store the answers to a text question, a number column is made for a number question and a choice column is made for a choice question. Again, I am not saying that this was not possible before in Excel, but this just makes it so much easier.
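The question-to-column mapping can be sketched as follows. This is a hypothetical illustration of the idea, not how Excel implements it internally; the type names are made up for the example:

```javascript
// Hypothetical sketch: each survey question produces a workbook column
// whose type matches the question's response type.
const columnTypeFor = {
  text: "text",
  number: "number",
  choice: "choice",
  date: "date",
};

function buildColumns(questions) {
  return questions.map((q) => ({
    header: q.title,
    // fall back to a plain text column for any unmapped response type
    type: columnTypeFor[q.type] || "text",
  }));
}
```

Given a survey with a text question and a number question, this yields one text column and one number column, mirroring what the survey builder does behind the scenes.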

In the interest of being totally honest, I will add the note that Microsoft also includes on their support page, which is that “Columns in the spreadsheet are built as you add questions to the survey form. Changes you make to the survey form are updated in the spreadsheet, unless you delete a question or change the order of questions on the form. You'll have to update the spreadsheet manually in those cases: delete the columns that go with the questions you deleted, or cut and paste columns to change their order.” (from: Office Support)

Why should I use Excel surveys?

For me this new feature really shows that Microsoft has a sense of what their customers are doing with their products. Because, as stated earlier, building surveys in Excel has been done before. And also for other survey tools that Microsoft brought to life, for example the SharePoint Survey app, the data is stored in a list and usually analyzed in Excel. So building a survey feature into Excel itself makes sense.

And there are some additional benefits next to the fact that your survey building time will become far shorter with this new feature. First, as explained, you only share a link with your audience. This means that you do not have to give the people who fill out the survey access to the data in the workbook. This separation of data entry and data storage fits the security-driven world we act in today. Second, since you share a link to a webpage with your audience, the survey is easily accessible from any device (desktop, laptop, tablet, mobile, etc.) as long as you have a browser and an internet connection. A colleague of mine opened the email I sent out on his phone and could fill in the survey straight away.

If I have to position the Excel survey among the other survey tools that Microsoft offers, I would put it as an equal weight and possible replacement of the SharePoint survey app. For the quick and dirty poll you can use a third-party poll app or even the Outlook voting buttons, and for structured business processes you can use InfoPath forms or a third-party form builder like Nintex. But Excel surveys fit nicely in between, for scenarios where you do have multiple questions to ask but it is an ad hoc or one-time thing that doesn't need or justify developing a custom form.

This blog post is part of the series Forms and SharePoint. More on this Topic can be found HERE

Forms and SharePoint: InfoPath is here to stay

light-man-hand-pen.jpeg

Forms? Why should I care?

For most people the word “forms” brings to mind bad memories of bureaucratic procedures with lengthy enrollment sheets that require you to enter data that you know for certain you have entered before. I know of these procedures, have been forced to go through a whole bunch of them, and I too can think of at least one form that made me give up before even starting the process it was for. But instead of yelling out “destroy, destroy!” whenever I come across a form, I am still a huge fan.

The reason for my everlasting love is simple: garbage in is garbage out for any process, and forms provide me with a weapon to ensure data quality at the point of entry. This focus on data quality was taught to me at a very early stage in my career, in light of the single source of truth principle, along with the master data management mantra: “Create once, validate once, use many”. A line that has been among my favorite quotes since the day I first heard it.

But hasn’t InfoPath left the building already?

Well actually, no. There has been a lot of talk all over the place, including official statements from Microsoft itself, that InfoPath is on its way out. And while these statements were very strong and decisive in 2014, when Microsoft first came forward with the topic, “…there will not be a new version of InfoPath and support will continue through April 2023…”[1], the 2016 release officially included InfoPath, which means that by the rules of the Lifecycle Support Model support continues until 2021 (2026 with extended support)[2]. And Microsoft itself has updated its earlier strongly formulated statement about InfoPath with a friendlier “…InfoPath Forms Services will be included in the next on-premises release of SharePoint Server 2016, as well as being fully supported in Office 365 until further notice. Customers will be able to confidently migrate to SharePoint Server 2016 knowing that their InfoPath forms will continue to work in their on-premises environments, as well as in Office 365.”[3]

And while Microsoft keeps saying that InfoPath 2013 will be the last version of InfoPath to be released, I see no immediate reason to move away from a tool that works and will continue to work in your latest on-premises and online environments.

So what is it good for?

To better understand my enthusiasm when it comes to forms, it is probably valuable to share that I am a process-minded guy with a strong background in Operational Excellence. And thus I know all too well that a good process does not allow defects to be created along the way, because a defect results in either rework or scrap, and both are a waste of time.

I am such a big fan of forms because they allow me to eliminate defects at the very beginning. And InfoPath gives me a big box of tricks to enforce data quality at the entry point. To list my favorite ones:

Data validation

We have all seen form entries which contradict themselves. Sometimes it is an honest mistake and sometimes it is unthinkable stupidity, but it is always annoying. With data validation you can add rules that check whether people are making mistakes while entering the form, for example by enforcing that the return date of a reservation is later in time than the indicated starting date. And with the ability to include multiple fields in one rule, even the most complicated scenarios (if A is higher than B and C is less than D, then E cannot be higher than B plus C and not lower than D minus A) can be validated. And let's face it, we can all name a process that has these kinds of ifs, thens and buts.
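The multi-field rule above can be made concrete with a short sketch. This is a hypothetical illustration of the validation logic, not InfoPath's rule engine; the field names are the placeholders from the text:

```javascript
// Hypothetical sketch of a multi-field validation rule: "if A is higher
// than B and C is less than D, then E cannot be higher than B plus C
// and not lower than D minus A".
function validateEntry({ a, b, c, d, e }) {
  if (a > b && c < d) {
    return e <= b + c && e >= d - a;
  }
  return true; // the rule only applies when the precondition holds
}

// The simpler everyday rule: the return date of a reservation must be
// later than the indicated starting date.
function validateReservation(startDate, returnDate) {
  return returnDate > startDate;
}
```

A form builder expresses exactly this kind of predicate declaratively; the point is that every referenced field participates in one rule, so contradictory entries are caught at submission time.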

Formatting

Who hasn’t come across the quote “If you have answered “No” to this question, please continue to question number...”? And of course, it is always nice to be able to skip a question, but wouldn’t it be even better if the questions you do not need to answer are not even asked in the first place? With formatting this is possible, because based on the given values you can determine whether to show a field or even a whole section of fields. So only those questions that need to be asked based on the previous answers will be shown. This also gives the user the feeling that the form is designed especially for him or her, which always wins you sympathy points.

Querying for data

One other great way to enhance data quality and reduce effort for your users is being able to query for data. The scenario that comes to mind is an ordering form, which always needs to contain information about a product and a customer. You know this information is already available somewhere in the organization, and probably stored in a very structured manner. So why not tap into this source? By providing the product code in the form, you can fire off a query that gets all the additional information about that product that you need. This has two big advantages: 1) as a user, I only need to type in the product code, and 2) as an administrator, I only have to keep the source list up to date to ensure data quality for every entered product code.
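The lookup pattern can be sketched like this. It is a hypothetical illustration of querying a maintained source from a form; the product list and codes are made up:

```javascript
// Hypothetical sketch of "querying for data": the user enters only a
// product code and the form fills in the rest from a maintained source.
const productSource = {
  "P-100": { name: "Widget", price: 9.95, category: "Hardware" },
  "P-200": { name: "Gadget", price: 24.5, category: "Electronics" },
};

function lookupProduct(productCode) {
  const product = productSource[productCode];
  if (!product) {
    return null; // unknown code: the form can flag this as invalid input
  }
  // Return a copy so form code cannot mutate the source of truth.
  return { code: productCode, ...product };
}
```

The user types one field; everything else is filled from the single source, so correcting a product detail in the source fixes it for every future form entry at once.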

Cascaded dropdowns

As a user, I always like it when a form thinks along with me. Take for example the scenario where I have to provide a shipping location by filling out country, city and building. As soon as I have provided the country, I would appreciate being able to pick only cities that are within that country. And as soon as I have picked the city, I only get buildings within that city. With cascading dropdowns this is possible, and again it is a way to enhance data quality and provide user comfort at the same time.
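The country/city/building cascade amounts to filtering one options list by the previous selection. A hypothetical sketch with made-up location data:

```javascript
// Hypothetical sketch of cascading dropdowns: each selection narrows the
// options for the next field.
const locations = [
  { country: "Netherlands", city: "Amsterdam", building: "HQ" },
  { country: "Netherlands", city: "Amsterdam", building: "Annex" },
  { country: "Netherlands", city: "Rotterdam", building: "Port Office" },
  { country: "India", city: "Pune", building: "Dev Center" },
];

// Options for the "city" dropdown, given the chosen country.
function citiesFor(country) {
  const cities = locations
    .filter((loc) => loc.country === country)
    .map((loc) => loc.city);
  return [...new Set(cities)]; // deduplicate repeated cities
}

// Options for the "building" dropdown, given country and city.
function buildingsFor(country, city) {
  return locations
    .filter((loc) => loc.country === country && loc.city === city)
    .map((loc) => loc.building);
}
```

Because every dropdown is derived from the same location list, an impossible combination like a Dutch building in an Indian city simply never appears as an option.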

Calculated fields

Calculated fields can be of use in two different ways. The first one is obvious because it is right there in the name: to perform a calculation. This can be anything from a simple total order amount calculated based on the quantity and the price, to a complex calculation of break-even revenue based on the price, sales per hour, salary per hour, rent per day and overhead cost percentage. Incorporating any formula into your form again enhances data quality and provides user comfort.
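Both ends of that spectrum can be sketched briefly. The total order amount follows directly from the text; the break-even formula below is one possible interpretation, made up for illustration, not a standard model:

```javascript
// Hypothetical sketches of calculated fields.
// Simple case: total order amount from quantity and unit price.
function totalOrderAmount(quantity, unitPrice) {
  return quantity * unitPrice;
}

// Illustrative break-even: revenue per hour needed to cover salary,
// rent and an overhead percentage (the cost model is an assumption).
function breakEvenRevenuePerHour(salaryPerHour, rentPerDay, hoursPerDay, overheadPct) {
  const costPerHour = salaryPerHour + rentPerDay / hoursPerDay;
  return costPerHour / (1 - overheadPct);
}
```

In a form tool these would be expressions bound to read-only fields, so the user sees the result update as the inputs change without being able to tamper with it.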

The second way that calculated fields can help you in InfoPath is by providing feedback to your user. Take the previous scenario of retrieving data about a product based on the product code. It is not a bad idea to show the user all the information that was gathered based on the entered product code, so he or she can check whether that is indeed the product information that is needed. However, you also do not want to allow the user to change this info. If there is indeed a fault in the data, then the source needs to be adjusted to solve it for everyone. The nice thing about calculated fields, as you also know from SharePoint lists, is that they cannot be edited. So if we present the retrieved info in calculated fields, the user gets their feedback and the admin maintains control over the data quality. Again a win-win scenario.

InfoPath

So what’s the downside?

Of course InfoPath is not perfect and there are limits to what you can do with it, but in my experience the vast majority of business processes can be greatly streamlined by pouring the data entry into smart forms built in InfoPath. And if you come across that process or trick that just cannot be made dummy-proof with a good form, then please ask yourself this question before going out and buying something else: “does my process really need to be this complex?”. But that's the Operational Excellence guy in me talking.

One true disadvantage of InfoPath is that it is not included in the standard SharePoint Server offering. You have to go Enterprise[4] to get it, and that is a pretty big difference in cost. And while everything has its price tag, you will need to sit down and do your homework before acquiring InfoPath from an investment point of view. It will take a cost-benefit analysis over multiple areas, weighing efficiency gains against license costs, before you know whether the investment of upgrading from a Standard Server license to an Enterprise Server license will pay off.

However, the Office 365 environment offers multiple possibilities to go about your licensing in a whole different way that could significantly reduce the total investment needed. Plus, you get the flexibility of scaling up and down in the cloud. Our licensing experts can certainly help you figure out what would be the best plan for your organization with respect to InfoPath licensing.

Another often-heard critique of forms is that they are ugly and you cannot do a lot in terms of design improvement. While InfoPath certainly is built with function over beauty in mind, there are still many possibilities to enhance the look of your forms. I would put it like this: if you can manage to create nice Excel sheets and Word documents, you will have the tools to present a good-looking form.

To make a long story short

InfoPath is definitely not out the door yet. As Microsoft promises its users, it will be included in the latest online and on-premises offerings for SharePoint and supported through 2026. So if you have InfoPath at your fingertips right now, use it! Build those forms and make your processes more robust and foolproof. The investment will pay itself back in the coming decade, and the experience you gain from digging into the details of your processes, determining what piece of information is needed when and what the best source is to retrieve it from, will be valuable forever. Because you will need to go through the same steps when building your forms in any other tool.

 

This blog post is part of the series Forms and SharePoint. More on this Topic can be found at http://08b.4d7.myftpupload.com/tag/FormsAndSharePoint/

New Experience SharePoint Online Library

The experience of SharePoint libraries is changing

Microsoft is pushing update MC44849, which changes the user experience of libraries in SharePoint Online. They've communicated this via the Message center and an article on Microsoft Office 365 Support. We've been expecting this update: for a while now we've grown fond of the new OneDrive for Business look & feel, and it was to be expected that this was the testing ground for updating all SharePoint libraries to this new experience. That time is now.

Has my tenant been updated?

There are two ways to check whether your tenant has been updated.

1. The new experience setting has been added to the SharePoint Admin Center. Go to your SharePoint Admin Center and choose Settings. If your tenant has been updated, you should now have an extra setting: SharePoint Lists and Libraries experience.

Tenant settings new experience

2. When you load a library in SharePoint Online, there is a big notification banner waiting for the user, telling them “Document Libraries are getting a new look!”. When this is shown, the update has been deployed.

Document Libraries are getting a new look!

Video overview

View all the changes and new functionality in this short walkthrough video (3:30 min) about the new experience for SharePoint libraries.

Most important changes

These are the most important changes that were deployed and that have a big impact on the interaction and functionality of the SharePoint library:

New experience - new item

- UI

When a library is displayed, it is loaded in a screen similar to the default OneDrive for Business style (blue/white). It feels like you've left the (sub)site and are looking at the library in a different app/site, which can be confusing at first for users who weren't informed of this change. One way to explain it is to disconnect the documents' "source" (the library) from the site with the list view web part: the list view web part is a window onto the source, and the user now opens the source itself, which is presented in a different way.

- Upload

The upload functionality itself doesn't change (drag-and-drop or the menu option), but the ability to upload complete folders is new. Microsoft learned that a lot of users were still switching to the classic view of OneDrive for Business just so they could use the "Open in Explorer" function and upload complete folders, so it makes sense that they have included this functionality.

- Navigation

As said, the library is displayed in its own environment, almost looking like a separate Office 365 app/add-in. The top bar/global navigation has been removed, together with the breadcrumb and any custom styling/master page. But the quick launch/current navigation still remains on the left; that is the user's way back.

- Link as content type

New experience - information pane

With the update comes the possibility for a user to add a link as an item (a *.url file); it's now a default entry in the "new item" menu. This means a user can add a link to a file, site, etc. and place it in the library, which helps in the battle against duplicate files that can get out of sync.

- Information pane

The live preview of a file and the display of a few properties have been moved from the item's context menu to the information pane, which slides out on the right when the "i" (info) icon is clicked. By default this pane shows the library's properties. When a file is selected, it shows the live preview of the file, its properties (which can be edited inline), sharing options and version history. Very useful.

- Spotlight/Pin

The user can now pin files and put them in the spotlight, like a featured image or document. If a file is pinned, a banner is added on top where the pinned files are shown as big live-preview tiles. Add your favorite, featured and/or important files there.

New experience - gallery view

FAQ

Have we lost all navigation? Not quite. The quick launch/current navigation is still present on the left side; that's the ticket back to the site. Furthermore, there is still the Office 365 app launcher.

Can the new experience be disabled? Yes, via the SharePoint Admin Center. Go to the SharePoint Admin Center and choose Settings (direct link: https://[DOMAIN]-admin.sharepoint.com/_layouts/15/online/TenantSettings.aspx). Look for the subheader "SharePoint Lists and Libraries experience" and choose "classic experience". This sets all document libraries to the classic experience for the whole tenant.

Can I prevent the update from deploying on my tenant? No. You could delay updates by switching to a different service model, which postpones the deployment of updates by several months. But this also means that all other updates will be delayed as well, and you might not want that. Read more about service models on TechNet: Change management for Office 365 clients.

Why is the new experience enabled by default? Microsoft is innovating and wants you to join in and make use of new features. Enabling it by default at least gets users to take a look, while still giving them the option (for now) to revert to the classic experience. It stimulates usage and makes us think about a strategy for implementing and communicating the new features.

Want to know more? Contact us at Mark.Overdijk@RapidCircle.com or Support@RapidCircle.com.

An ROI of 162%: the business case for Office 365.

“What are the advantages, what are the gains, what are the costs, what are the risks and how flexible are we with Office 365?” In other words: what is the business case? This is a question we are often asked. Research shows that it is now possible to earn back 162% of your investment.

Commissioned by Microsoft, Forrester Consulting studied the ‘Total Economic Impact’ (TEI) of Office 365. The TEI framework is a means to weigh the costs, advantages, flexibility and risks of an investment. Forrester's research combined qualitative and quantitative methods, looking closely at a number of cases and executing an extensive survey.

The conclusion of this research is that an investment in Office 365 absolutely pays off: an ROI of 162%, an internal rate of return of 468% and a payback period of seven months are the figures that come with an investment in Office 365.
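To make the ROI percentage concrete, here is a back-of-the-envelope sketch using the standard formula ROI = (benefits - costs) / costs. The cost and benefit figures below are hypothetical, chosen only so the arithmetic lands on 162%; they are not taken from the Forrester report.

```powershell
# Hypothetical three-year figures, for illustration only (not Forrester's data)
$costs    = 1000000   # total costs: licences, implementation, training, administration
$benefits = 2620000   # total quantified benefits over the same period

# ROI = (benefits - costs) / costs
$roi = ($benefits - $costs) / $costs
$roi   # 1.62, i.e. 162%
```

The same ratio works the other way around: knowing the ROI and your expected costs gives a rough sense of the benefits the study implies.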

Forrester researched the economic impact of Office 365 in five fields: technology, mobility, control & compliance, Business Intelligence and Enterprise Social. Below is a preview of the results:

  • Among other things, the research shows that Office 365 offers advantages related to technology, because organisations do not need to build a new hardware infrastructure and less manpower is needed to keep the platform running. The research also shows that after three years employees have an extra hour to spend per day because they work more efficiently in Office 365. They can use this extra time for the benefit of your organisation.
  • Examples of costs that have to be calculated when investing in Office 365 are costs for implementation, training, continuous system administration and licences.
  • The question of whether the use of Office 365 results in more flexibility is answered with a convincing Yes. One of the interviewees stated that ‘It has given us tremendous agility’.
  • There are also risks, implementation risk being one example: the final product may deviate from the expected requirements. According to Rapid Circle, however, this can be mitigated by implementing Office 365 in an agile way. By testing the requirements continuously, working iteratively and not fixing the end product at the start of the project, the implementation can always be adjusted to the requirements of the moment.

 

Would you like to know exactly how your organisation can benefit from Office 365, what the costs are, how it will keep your organisation agile and what the risks are? Download the research report by filling in the form below.

Would you like to discuss the options of Office 365 for your organisation? Please contact us!

[email-download download_id="8911" contact_form_id="8846"].

 

PowerShell: Publishing all files in a SharePoint Online library programmatically

One of our clients built up a library of 500+ documents. After these were modified (metadata was added and the content went through several rounds of corrections), we were asked to mass-publish all files so the site could go live. That leaves two options: 1. manually check in, publish and approve all files, or 2. put some CSOM and PowerShell together in a file and do it programmatically. Of course I, Mark Overdijk, chose to pursue the second option. I asked Massimo Prota to assist in getting a script ready. The first version of the script turned out rather useful, so I added some extra features plus more output to, and interaction with, the user. This latest version is generic enough to be re-used.

Features

- No limitation on the number of files in a list/library
- Added code to filter which files should be published
- User will be prompted for password and confirmation
- Feedback to the user on screen
- If a file is checked out, the script will check it in before proceeding
- If Content Approval is enabled for the list/library, the script will approve the file
- Screen output is saved to a txt file whose name includes the current date/time
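The last feature, a log file whose name embeds the current date/time, comes down to formatting the current timestamp with Get-Date and feeding the result to Start-Transcript. A minimal sketch of how such a name is built:

```powershell
# Build a transcript file name that embeds the current date/time,
# e.g. PublishFilesSPO_20160315_142501.txt
$Now  = Get-Date -UFormat %Y%m%d_%H%M%S
$File = "PublishFilesSPO_$Now.txt"
$File
```

Because the timestamp goes down to the second, each run of the script produces its own transcript file instead of overwriting the previous one.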

Prerequisites PowerShell

Step 1. Gather parameters

For the script to run properly, you'll need the following parameters:

$SiteUrl: the full URL to the (sub)site where the list is stored for which you want to publish/approve the files

$ListName: the Title of the list for which you want to publish/approve the files

$UserName: a UserName that has enough permissions to publish/approve the files

Step 2. Run PowerShell script

Start Windows PowerShell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts:

Set-ExecutionPolicy Unrestricted [ENTER]

Input "A" for all. After the ExecutionPolicy is set, we can run the script file.

[code language="powershell"]
####################################
# Script:  PublishFilesSPO.ps1     #
# Version: 2.0                     #
# Rapid Circle (c) 2016            #
# by Mark Overdijk & Massimo Prota #
####################################

# Clear the screen
Clear-Host

# Add Wave16 references to SharePoint client assemblies - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

# Parameters
# Specify the subsite URL where the list/library resides
$SiteUrl = "https://DOMAIN.sharepoint.com/SUBSITE"
# Title of the list/library
$ListName = "TITLE"
# Username with sufficient publish/approve permissions; the user will be prompted for the password
$UserName = "USER@DOMAIN.com"

# Set the transcript file name and start the transcript
$Now = Get-Date -UFormat %Y%m%d_%H%M%S
$File = "PublishFilesSPO_$Now.txt"
Start-Transcript -Path $File | Out-Null

# Display the entered values to the user
Write-Host "/// Values entered for use in script ///" -ForegroundColor Cyan
Write-Host "Site: " -ForegroundColor White -NoNewline; Write-Host $SiteUrl -ForegroundColor Green
Write-Host "List name: " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green
Write-Host "User account: " -ForegroundColor White -NoNewline; Write-Host $UserName -ForegroundColor Green

# Prompt the user for the password
$SecurePassword = Read-Host -Prompt "Password" -AsSecureString
Write-Host "All files in " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green -NoNewline; Write-Host " on site " -ForegroundColor White -NoNewline; Write-Host $SiteUrl -ForegroundColor Green -NoNewline; Write-Host " will be published by " -ForegroundColor White -NoNewline; Write-Host $UserName -ForegroundColor Green
Write-Host " "

# Prompt to confirm
Write-Host "Are these values correct? (Y/N) " -ForegroundColor Yellow -NoNewline; $confirmation = Read-Host

# Run the script when the user confirms
if ($confirmation -eq 'y') {

    # Bind to the site collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    $credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
    $Context.Credentials = $credentials

    # Bind to the list and query all items
    $List = $Context.Web.Lists.GetByTitle($ListName)
    $collListItem = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
    $Context.Load($List)
    $Context.Load($collListItem)
    $Context.ExecuteQuery()

    # Go through the process for all items
    foreach ($ListItem in $collListItem){
        # Add spacer
        Write-Host " "
        Write-Host "/////////////////////////////////////////////////////////////"
        Write-Host " "
        # Write the item ID, file name and modified date of each item that will be published
        Write-Host "Working on file: " -ForegroundColor Yellow -NoNewline; Write-Host $ListItem.Id, $ListItem["FileLeafRef"], $ListItem["Modified"]

        # Un-comment the "if" below to filter which files will be published.
        # Fill out the details of the files that should be skipped.
        # The example skips all files last modified before 31-jan-2015.
        # if ($ListItem["Modified"] -lt "01/31/2015 00:00:00 AM"){
        #     Write-Host "This item was last modified before January 31st 2015" -ForegroundColor Red
        #     Write-Host "Skip file" -ForegroundColor Red
        #     continue
        # }

        # Check whether the file is checked out: the "Checked Out By" column is not empty
        if ($ListItem["CheckoutUser"] -ne $null){
            # The file is checked out, so check it in first
            Write-Host "File: " $ListItem["FileLeafRef"] "is checked out." -ForegroundColor Cyan
            $ListItem.File.CheckIn("Auto check-in by PowerShell script", [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
            Write-Host "- File Checked in" -ForegroundColor Green
        }

        # Publish the file
        Write-Host "Publishing file:" $ListItem["FileLeafRef"] -ForegroundColor Cyan
        $ListItem.File.Publish("Auto publish by PowerShell script")
        Write-Host "- File Published" -ForegroundColor Green

        # If Content Approval is enabled for the list/library, approve the file
        # when its "Approval Status" column does not equal "0" (= Approved)
        if ($List.EnableModeration -eq $true){
            if ($ListItem["_ModerationStatus"] -ne '0'){
                Write-Host "File:" $ListItem["FileLeafRef"] "needs approval" -ForegroundColor Cyan
                $ListItem.File.Approve("Auto approval by PowerShell script")
                Write-Host "- File Approved" -ForegroundColor Green
            }
            else {
                Write-Host "- File has already been Approved" -ForegroundColor Green
            }
        }
        $Context.Load($ListItem)
        $Context.ExecuteQuery()
    }

    # Add footer
    Write-Host " "
    Write-Host "/////////////////////////////////////////////////////////////"
    Write-Host " "
    Write-Host "Script is done" -ForegroundColor Green
    Write-Host "Files have been published/approved" -ForegroundColor Green
    Write-Host "Thank you for using PublishFilesSPO.ps1 by Rapid Circle" -ForegroundColor Cyan
    Write-Host " "
}
# Stop the script when the user doesn't confirm
else {
    Write-Host " "
    Write-Host "Script cancelled by user" -ForegroundColor Red
    Write-Host " "
}
Stop-Transcript | Out-Null

##################################
# Rapid Circle                   #
# http://08b.4d7.myftpupload.com #
##################################
[/code]

The 9 secrets of facilitating internal news efficiently

Are you having trouble getting everybody in your organisation to read the internal news? That's understandable! Did you know that on average only 32% of recipients open a digital newsletter? A large group in your organisation won't be reached; they do not see added value in reading the newsletter. But why don't they?

In order to help you reach the other 68% of the organisation with your internal news, we want to share our nine best tips for increasing the reach and relevance of your newsletter. Things to consider include:

  • Making it more personal
  • Making it more accessible
  • Making it easier

With the right tool it is quite simple to meet all these conditions. Do you want to increase the relevance and reach of your internal news? Download the article 'The 9 secrets of facilitating internal news efficiently' free of charge by filling out the form below.

[email-download download_id="8838" contact_form_id="8846"].


PowerShell: Terminate a workflow for all items in a list on SharePoint Online

This is a follow-up to our previous post "PowerShell: Start a workflow for all items in a list on SharePoint Online". Great as it is to have a script that starts a workflow for all items, it would also be useful to be able to stop all workflows when necessary. So I, Mark Overdijk, got to work again with Massimo Prota to get this script in place. It is very similar to the StartWorkflow PowerShell script; the difference is that we don't retrieve the workflow through WorkflowAssociations but have to use WorkflowInstances.

Prerequisites PowerShell

Step 1. Gather parameters

For the script to run properly, you'll need the following parameters:

$SiteUrl: the full URL to the (sub)site where the list is stored for which you want to run the workflow

$ListName: the Title of the list for which you want to run the workflow

$UserName: a UserName that has enough permissions to run the workflow

Step 2. Run PowerShell script

Start Windows PowerShell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts:

Set-ExecutionPolicy Unrestricted [ENTER]

Input "A" for all. After the ExecutionPolicy is set, we can run the script file.

Copy/paste the code below into a txt file and save it as an *.ps1 file (in this example "StopWorkflow.ps1"). Fill out the parameters with the gathered information and run the script.

PowerShell stop workflow

[code language="powershell"]
# Add Wave16 references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.WorkflowServices.dll"

# Specify tenant admin and site URL
$SiteUrl = "https://[TENANT].sharepoint.com/"
$ListName = "[TITLE OF THE LIST]"
$UserName = "[USERNAME]"
$SecurePassword = Read-Host -Prompt "Enter password" -AsSecureString

# Bind to the site collection
$ClientContext = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$ClientContext.Credentials = $credentials
$ClientContext.ExecuteQuery()

# Get the list
$List = $ClientContext.Web.Lists.GetByTitle($ListName)
$ClientContext.Load($List)
$ClientContext.ExecuteQuery()

# Get all list items
$ListItems = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$ClientContext.Load($ListItems)
$ClientContext.ExecuteQuery()

# Create a WorkflowServicesManager instance
$WorkflowServicesManager = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager($ClientContext, $ClientContext.Web)

# Connect to the WorkflowSubscriptionService
$WorkflowSubscriptionService = $WorkflowServicesManager.GetWorkflowSubscriptionService()

# Connect to the WorkflowInstanceService
$WorkflowInstanceService = $WorkflowServicesManager.GetWorkflowInstanceService()

$ClientContext.Load($WorkflowServicesManager)
$ClientContext.Load($WorkflowSubscriptionService)
$ClientContext.Load($WorkflowInstanceService)
$ClientContext.ExecuteQuery()

# Get the workflow associations for the list
$WorkflowAssociations = $WorkflowSubscriptionService.EnumerateSubscriptionsByList($List.Id)
$ClientContext.Load($WorkflowAssociations)
$ClientContext.ExecuteQuery()

# Loop over all list items and terminate their workflow instances
For ($j=0; $j -lt $ListItems.Count; $j++){

    $msg = [string]::Format("Killing workflows {0} on ListItemID {1}", $WorkflowAssociations[0].Name, $ListItems[$j].Id)
    Write-Host $msg

    # Enumerate the running workflow instances for this list item
    $itemWfInstances = $WorkflowInstanceService.EnumerateInstancesForListItem($List.Id, $ListItems[$j].Id)
    $ClientContext.Load($itemWfInstances)
    $ClientContext.ExecuteQuery()

    for ($k=0; $k -lt $itemWfInstances.Count; $k++) {
        try {
            $WorkflowInstanceService.TerminateWorkflow($itemWfInstances[$k])
            $msg = "Workflow terminated on " + $ListItems[$j].Id
            $ClientContext.ExecuteQuery()
        }
        catch {
            $msg = "Error terminating workflow on " + $ListItems[$j].Id + " Details: $_"
        }
        Write-Host $msg
    }
}
[/code]

PowerShell: Start a workflow for all items in a list on SharePoint Online

For one of our Office 365 clients (a mix of E1 and E3 licences) we created a workflow that checks the status of an item and, depending on this status, sends out e-mails and updates other columns. As the list was already in use, we needed to start the workflow for all existing items, and starting it manually for all 477 items was not preferable. So I, Mark Overdijk, asked Massimo Prota to help me on the quest to see whether it would be possible to do it via PowerShell. As there are no PowerShell commands available for SharePoint Online to access workflow instances, we searched for CSOM solutions and came across this script on GitHub. Thanks to Azam-A we had a base script to work from. What we changed/added is the following:

  • Referenced the new wave16 components as Office 365 is already on wave16.
  • Added feedback in the script when it runs. It'll show for each item the item ID for which the script is starting the workflow.
  • For obvious security reasons we're not storing the user's Admin password as plain text, but prompt for the password.
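The password handling from the last bullet comes down to keeping the credential as a SecureString instead of plain text. A minimal sketch: the Read-Host line is how the scripts below do it interactively, while the ConvertTo-SecureString line is only a demo for non-interactive experiments (it puts the secret in your source, which defeats the purpose for real passwords).

```powershell
# Interactive prompt, as used in the scripts below:
# $SecurePassword = Read-Host -Prompt "Enter password" -AsSecureString

# Demo only: building a SecureString from literal text (never do this with a real password)
$SecurePassword = ConvertTo-SecureString "P@ssw0rd-demo" -AsPlainText -Force
$SecurePassword -is [System.Security.SecureString]   # True
```

Either way, the resulting SecureString can be handed straight to the SharePointOnlineCredentials constructor, so the plain-text password never sits in a script variable.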

Prerequisites PowerShell

Step 1. Gather required parameters

For the script to run properly, you'll need the following parameters:

$SiteUrl: the full URL to the (sub)site where the list is stored for which you want to run the workflow

$ListName: the Title of the list for which you want to run the workflow

$UserName: a UserName that has enough permissions to run the workflow

Step 2. Run PowerShell script

Start Windows PowerShell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts:

Set-ExecutionPolicy Unrestricted [ENTER]

Input "A" for all. After the ExecutionPolicy is set, we can run the script file.

Copy/paste the code below into a txt file and save it as an *.ps1 file (in this example "StartWorkflow.ps1"). Fill out the parameters with the gathered information and run the script.

StartWorkflow1 screenshot in PowerShell

[code language="powershell"]
# Add Wave16 references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path (Resolve-Path "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll")
Add-Type -Path (Resolve-Path "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll")
Add-Type -Path (Resolve-Path "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.WorkflowServices.dll")

# Specify tenant admin and site URL
$SiteUrl = "https://[TENANT].sharepoint.com/"
$ListName = "[TITLE OF THE LIST]"
$UserName = "[USERNAME]"
$SecurePassword = Read-Host -Prompt "Enter password" -AsSecureString

# Connect to the site
$ClientContext = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$ClientContext.Credentials = $credentials
$ClientContext.ExecuteQuery()

# Get the list and list items
$List = $ClientContext.Web.Lists.GetByTitle($ListName)
$ListItems = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$ClientContext.Load($List)
$ClientContext.Load($ListItems)
$ClientContext.ExecuteQuery()

# Retrieve WorkflowService related objects
$WorkflowServicesManager = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager($ClientContext, $ClientContext.Web)
$WorkflowSubscriptionService = $WorkflowServicesManager.GetWorkflowSubscriptionService()
$WorkflowInstanceService = $WorkflowServicesManager.GetWorkflowInstanceService()
$ClientContext.Load($WorkflowServicesManager)
$ClientContext.Load($WorkflowSubscriptionService)
$ClientContext.Load($WorkflowInstanceService)
$ClientContext.ExecuteQuery()

# Get the workflow associations for the list
$WorkflowAssociations = $WorkflowSubscriptionService.EnumerateSubscriptionsByList($List.Id)
$ClientContext.Load($WorkflowAssociations)
$ClientContext.ExecuteQuery()

# Prepare the start workflow payload (empty initiation parameters)
$Dict = New-Object 'System.Collections.Generic.Dictionary[System.String,System.Object]'

# Loop over all list items and start the workflow
For ($j=0; $j -lt $ListItems.Count; $j++){
    $msg = [string]::Format("Starting workflow {0}, on ListItemId {1}", $WorkflowAssociations[0].Name, $ListItems[$j].Id)
    Write-Host $msg
    # Start the workflow on the list item
    $Action = $WorkflowInstanceService.StartWorkflowOnListItem($WorkflowAssociations[0], $ListItems[$j].Id, $Dict)
    $ClientContext.ExecuteQuery()
}
[/code]

If, for some reason, you want to stop/terminate all workflows, check this blog post: PowerShell: Terminate a workflow for all items in a list on SharePoint Online