SharePoint

Improve internal news with the new Attini Feature: Shadow Copy

With the popularity and usage of our internal news app Attini Comms expanding, we figured it was time to bring a new feature into the game for those organizations that have become lean, mean news machines. It is called Shadow Copy, and it allows you to copy an article that you found in another channel to your own channel. This way news can spread even faster.

Who is it for?

This feature is very useful for organizations with over ten news channels and/or where the creation of news is a decentralized activity. It will help owners of a channel to easily bring in interesting articles from other channels and present them to their own audience. And the readers of those channels of course also benefit, for this will bring more news their way.

How it works

Since the Shadow Copy feature bundles quite a lot of functionality, it is easiest to explain how it works with a scenario. So, let's first meet the stars of our story.

First up is Mark, the Attini administrator of Basically A Random Company Inc. (BARC Inc.), who makes sure that all twenty channels he administers are kept in tip-top shape. Second is Nicole, the Director of the Sales Department and owner of the Sales news channel. Third we have Harold, an employee of BARC Inc.

Let's say that in the IT channel of BARC Inc. a very interesting article has been published about how to safely connect to Wi-Fi networks at work. The tips and tricks mentioned in the article are very useful to all the colleagues working for the IT Department to set up secure connections. So, Mark has read the article and, during a chat at the water cooler, tells Nicole and Harold about it. Harold is a real news buff who has already read the article and encourages Nicole to check it out. Nicole knows that a lot of Sales colleagues are working from all different places, so she decides to look up the article in the IT channel.

When the three colleagues look at the article they all have a different set of options presented to them due to the new Shadow Copy feature:

In the top right corner of the article page, a menu is presented based on the permissions of the user looking at the page. Each channel for which you have permission to publish an article will be shown. That means that Mark sees all channels (because he is the Attini Admin), Nicole only sees the Sales Channel (because she has Contribute permissions only for that channel) and Harold doesn’t even see the menu since he owns no channels at all.

After reading the article, Nicole decides to copy it to the Sales Channel. During the copy a new article is created within the Sales Channel. However, this is no ordinary article. For one, Nicole cannot edit the content of the article. She wanted to copy the article and that is what she gets: an exact copy. But more importantly, a relationship is created between the original article (further known as the mother) and the copied article (further known as the child).

This relationship is used to update the child article as soon as a change is published in the mother article, and to show where the child article originally came from. Showing where the article originally came from is important information. It leads Harold the news buff from the child article to the channel that possibly contains more articles about internet security. And it leads other channel owners to the mother article, from which they can also create a copy for their news channel. This is important because the feature does not allow copying a child article. To put it in family terms, we allow creating sibling articles but aim to avoid a grandchild or even great-grandchild article being created.

Now that the article is copied, Nicole has brought the tips and tricks about internet security to her audience. With an easy click of a button she has made the Sales colleagues happy. However, we did not forget about Harold our news buff. Like a true news enthusiast, Harold has subscribed to all twenty channels that BARC Inc. has. This means that he gets all the articles from the IT Channel (containing the mother article) and all articles from the Sales Channel (containing the child article). Of course, we filter out the duplicates from every feed so no one sees the same article twice. Mark, Nicole and Harold see the following items in their feeds.

Harold only sees one of the Internet Security articles, because the duplicates are filtered out of his feed. In this case the child is filtered out and the mother article is featured in his feed. As long as the reader is subscribed to the channel which holds the mother article, they will only see the mother article in their feed. If an article is copied to multiple channels, so that multiple child articles exist, and a user is not subscribed to the channel holding the mother article, then the oldest child article that user can see is the one shown in their feed.

So, if the Internet Security article were copied to a third channel, Harold would still only see the mother article in his feed. In the event that Harold unsubscribes from the IT channel, which holds the mother article, he would then only see the oldest child article in his feed, which is the one in the Sales channel, because Nicole copied the article first.

By filtering out the duplicates we can guarantee that the Shadow Copy feature can be used without limit, bringing news articles to new audiences and not polluting the feeds of people who already had access to the article. And by only showing the oldest article a user is subscribed to, we make sure that every user sees the article when it has the most news value for them.
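
For readers who like to see the rule spelled out, here is a purely illustrative sketch of that selection logic in PowerShell. It is not Attini's actual code; the function name and the property names (ChannelId, IsMother, PublishedDate) are assumptions made up for the example.

    # Illustrative only: pick which copy of one article family to show in a user's feed
    function Select-FeedArticle {
        param(
            [Parameter(Mandatory)] [object[]] $FamilyArticles,    # the mother article plus all its child copies
            [Parameter(Mandatory)] [string[]] $SubscribedChannels # channels the user follows
        )

        # Only copies published in channels the user follows are candidates
        $visible = $FamilyArticles | Where-Object { $SubscribedChannels -contains $_.ChannelId }
        if (-not $visible) { return $null }

        # Rule 1: if the user follows the mother's channel, show the mother article
        $mother = $visible | Where-Object { $_.IsMother }
        if ($mother) { return $mother }

        # Rule 2: otherwise show the oldest child copy the user can see
        return $visible | Sort-Object PublishedDate | Select-Object -First 1
    }

Applied to Harold: as long as he follows the IT channel, the mother article wins; if he unsubscribes, the Sales copy is shown because it is the oldest child he can see.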

Customers using the Sticky News feature need not fret: the deduplication in Shadow Copy means that even if the article Nicole copied was a sticky one, Harold will still see it only once.

How to get started

Are you interested in adding the Shadow Copy feature to your Attini Comms installation? Then feel free to contact us by sending an email to attini@rapidcircle.com. We will be happy to run you through all the details and get you started!

Do you want to give your company news an additional boost in reaching all your colleagues? Then make sure you check out the Sticky News feature we just released.

PnP PowerShell: Maintain all your term set data across tenants

The Term store manager available in SharePoint enables companies to manage their enterprise-specific taxonomy easily through new term groups and term sets. This metadata can then be referenced by users for selecting choices when filling in profile or content-related data.

Enterprise taxonomies can sometimes contain dozens of groups with too many term sets and terms to manage, update or copy manually. There are standard ways to export taxonomies into a .csv file and import them into the term store on a different tenant.

But what if you want to export not only term sets and term labels, but also their other term-specific data and configuration, such as:

  • Localised labels for each term
  • Synonyms
  • Navigation settings (if the term set is intended for site navigation)
  • Custom properties associated with each term
  • The exact GUID of the term

The above data may not be interesting for end users; for administrators, content creators and developers, however, these additional elements of a term are very important.

Fortunately, we can export all of this term set configuration using the powerful and very useful PnP PowerShell cmdlets.

Thanks to the efforts of the Microsoft Patterns & Practices (PnP) community, we now have a set of useful PowerShell cmdlets that can help us. The list of cmdlets is continuously growing, and as administrators we find we can accomplish many more tasks that would otherwise require writing CSOM code.

Specifically, the cmdlets that we can use are:

Export-PnPTermGroupToXml – Enables us to export a term group and all its underlying terms' settings to an XML-based output file

Import-PnPTermGroupFromXml – Enables us to import a term group from an XML file

Export your taxonomy

To use the cmdlets, I first need to enter my credentials and connect to my SPO tenant's content type hub site collection:

Once connected, I simply need to pass an output file and the term group name I want to export.
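
The exact commands depend on your tenant; a minimal sketch of those two steps could look like this (the site URL, term group name and output path are placeholder examples):

    # Connect to the tenant (placeholder URL; you will be prompted for credentials)
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

    # Export the term group, including labels, synonyms, custom properties and GUIDs, to an XML file
    Export-PnPTermGroupToXml -Identity "Enterprise Taxonomy" -Out "C:\Temp\EnterpriseTaxonomy.xml"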

Looking at the exported XML, you can see that all the relevant term settings, including GUIDs, are now available to import into another term store.

Importing your taxonomy

The import is done in a similar manner.

Connect to the destination tenant

Pass the XML file as a parameter, as seen below.
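
Again a minimal sketch, with placeholder URL and file path:

    # Connect to the destination tenant (placeholder URL)
    Connect-PnPOnline -Url "https://fabrikam.sharepoint.com" -Credentials (Get-Credential)

    # Recreate the term group, including labels, synonyms, custom properties and GUIDs
    Import-PnPTermGroupFromXml -Path "C:\Temp\EnterpriseTaxonomy.xml"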

That’s it!

Improve internal news with the new Attini Feature: Sticky News

All news is important, but some news is more important than other news. And that is why we bring you Sticky News as a new feature inside the number one Office 365 news application.

Sticky News is a feature that is available as of this summer for Attini Comms. The new feature allows the creator of a news article to flag their article as being extra important, which makes sure the news article remains at the top of the list in the news feeds. So instead of being pushed down by other articles with a more recent publishing date, the sticky news article will remain at the top of the news feed for the number of days specified by the writer of the article. To give it an extra touch of importance, it is possible with a slight design change to highlight a sticky news article by displaying it in a contrasting color, depending on the type of Attini Reader web part you are using.

How to use

As you may have noticed, the technical and functional changes required to turn a news article into a sticky news article are not that major. However, the feature has proven to be a total game changer for organizations that are already used to Attini. The reason why is simple: having the ability to make your articles sticky gives you a big advantage when spreading your news. And as our favorite spider-like superhero so aptly stated, “with great power comes great responsibility”.

The Sticky News functionality can be granted to each news channel that you have within your Attini Comms setup, and this is done via the Attini Comms Dashboard. This means that only an Attini Administrator has the power to turn a news channel into a sticky news channel that can produce sticky news articles. It is also good to note that the power can be taken away again by the Attini Administrator. So, the control over who gets the power and who doesn't lies with the person administrating the whole Attini Comms landscape on the customer side, exactly where you want this control to be.

Once the Attini Administrator has granted the Sticky News functionality to a channel, the owners of that channel have the power to create Sticky News articles. Per article they have the option to upgrade it to a Sticky News article or just publish it as a regular article. And since the difference is indicated by a simple flick of a switch, it is easy to make your news sticky. Even if you posted an article as a regular news story and the next day decide to make it sticky, you simply check the box and publish the article again to move it to the top of the feed.

Per channel you can have a maximum of three Sticky News articles. Let's assume you post one sticky news article every day. If you start on a Monday, Monday's sticky article is in spot one and the rest of the spots are filled by other articles based on the priority model you have chosen (most recent, most liked, most viewed, etc.). On Tuesday, your Tuesday sticky article takes spot one, the Monday sticky article moves to spot two, and spots three and lower are taken by the rest. On Wednesday, your Wednesday sticky article takes spot one, the Tuesday sticky article takes spot two, the Monday sticky article moves to spot three, and spots four and lower are taken by the rest. On Thursday, everything moves again when your Thursday sticky article is published, but now the Monday sticky article is no longer considered sticky and is ranked amongst the rest according to the ranking model. Below, an overview is given of how the feed would look on each of these days.

Up to this point the story is plain and simple. You have one channel fitted with Sticky News and the owner of that channel can choose which stories to make sticky. However, most of our customers have multiple news channels, which in practice means anywhere from two to over fifty channels. And that is when the Sticky News scenario becomes interesting. Because the more news you have, the bigger the benefit of having a way to make an article stand out. On the other hand, the more people who have the power to put their news at the top of the feed for days, the higher the risk of editors gaming the system.

To make sure you get the benefit from the Sticky News functionality and don't turn the business of publishing news into a free-for-all fight over who can get their article at the top of the feed, some best practices need to be taken into consideration.

  • First, only channels with a wide audience should be candidates for the Sticky News upgrade. If a channel only has 50% or less of the company in the audience set, then the news published there is already targeted at such a specific group that probably every story is equally important for the reader.
  • Second, only channels with a high frequency of publishing articles need a functionality like Sticky News. If you only publish an article once a week, it will stay at the top of your feed anyhow, so no need for making it stick.
  • Third, make sure that if multiple people have the power to create Sticky News, they communicate with each other on a regular basis. If the power lies with a central communications team or with colleagues who sit near one another, they have ample opportunity to discuss which stories should dominate the feeds in the coming days. The situation you want to avoid is two colleagues who never talk to each other battling it out over the news channels to get their news on top. The result of such a situation is seen in the comment section of every YouTube video and always turns ugly.
  • Fourth, be sure to monitor the use of the Sticky News functionality after it has been granted to a channel. If the functionality is not used properly (or not used at all) it is worth a conversation with the owners of that channel. And don’t be shy to take away the power again, because having not so interesting articles dominating the feeds may make the creator of that news very happy, but could make your readers frustrated.

How to get started

Are you interested in adding the Sticky News feature to your Attini Comms installation? Then feel free to contact us by sending an email to attini@rapidcircle.com. We will be happy to run you through all the details and get you started!

Do you want to give your company news an additional boost in reaching all your colleagues? Then make sure you check out the Shadow Copy feature we just released.

PnP PowerShell: Managing Content Type Artefacts across a single or multiple Office 365 tenants

Creating content types in SharePoint has always been relatively easy for site and content administrators. Furthermore, with the Content Type Hub feature, custom content types can be centrally defined and pushed out to all site collections. The challenges and difficulties, however, arise when you want to make some inherent changes to these site objects, or want these exact site objects to be present across your DTAP (Dev, Test, Acceptance & Production) environments.

For instance,

  • I've created my custom content types in my dev tenant. How do I migrate the changes to production?
  • How can I update the internal name of a field within a content type and ensure that the changes are reflected everywhere?

Actions like these were (and still are) generally avoided because there was no good way of accomplishing them. It is still very good practice to thoroughly prepare and review what's needed before creating custom content types. Making changes to these artefacts still requires effort, especially when there is content that is already using them.

Fortunately, managing existing content types has gotten easier, thanks to the efforts of the Microsoft Patterns & Practices (PnP) community.

We now have a set of useful PowerShell cmdlets that can help us. The list of cmdlets is continuously growing, and as administrators we find we can accomplish many more tasks that would otherwise require writing CSOM code.

You can go through the PnP cmdlet documentation here: https://github.com/SharePoint/PnP-PowerShell/tree/master/Documentation

I want to focus on creating content types and managing changes to these artefacts. For that, you use the following two PnP cmdlets:

Get-PnPProvisioningTemplate: Enables you to extract a site template with all or a partial set of the site artefacts. The extraction is an XML file which can be reviewed and updated.

Apply-PnPProvisioningTemplate: Enables you to apply an extracted site template to an existing site, essentially providing you with a means to apply changes to all sites in a tenant or to a different tenant.

The overall process then would look like this:

Create custom artefacts in content type hub

As usual, create your fields and content types in the content type hub. I recommend that you:

  • Tag these artefacts in a custom group so they are easily identifiable
  • Decide on a naming convention for both fields and content types that helps others to see that these are custom artefacts
  • Avoid spaces in field names when initially creating them; otherwise you end up with awkward internal names

In that case the space is replaced with the hexadecimal escape "_x0020_" (in my demo, Document_x0020_Category). This is not a critical issue, but it can be avoided and corrected.

For this demo, I've created a content type in a unique group ("Demo Only"), with a custom field called Document Category.

Extract artefacts using Get-PnPProvisioningTemplate

Using the cmdlet, I first enter my credentials and connect to my SPO tenant's content type hub site collection.

Then I extract only the fields and content types using the -Handlers parameter.
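
A minimal sketch of those two steps (the URL and output path are placeholders):

    # Connect to the content type hub site collection (placeholder URL)
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

    # Extract only the fields and content types into an XML provisioning template
    Get-PnPProvisioningTemplate -Out "C:\Temp\ContentTypeHub.xml" -Handlers Fields,ContentTypes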

Make changes to your artefacts in XML

In your XML output file you will find all the fields and content types. You can search for the relevant ones by looking for the group name ("Demo Only" in my case).

You can now edit field properties such as the StaticName and Name.

Be sure to update the reference to the renamed field in the corresponding content types as well. In my case I had created a "Demo Content Type" whose field reference needed the same change.
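
The exact markup depends on the PnP provisioning schema version, so treat the snippet below as a trimmed illustration rather than literal output; the GUIDs and the content type ID are shortened placeholders:

    <!-- Before: the field as extracted, with the escaped space in its internal name -->
    <Field Type="Choice" DisplayName="Document Category" Group="Demo Only"
           ID="{c1a2...}" Name="Document_x0020_Category" StaticName="Document_x0020_Category" />

    <!-- After: Name and StaticName cleaned up -->
    <Field Type="Choice" DisplayName="Document Category" Group="Demo Only"
           ID="{c1a2...}" Name="Document_Category" StaticName="Document_Category" />

    <!-- The content type's field reference must point to the same internal name -->
    <pnp:ContentType ID="0x010100A1B2..." Name="Demo Content Type" Group="Demo Only">
      <pnp:FieldRefs>
        <pnp:FieldRef ID="{c1a2...}" Name="Document_Category" />
      </pnp:FieldRefs>
    </pnp:ContentType>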

Once you're satisfied with your changes, save the XML file and you are ready to apply the changes to the original content type hub site collection.

Apply changes using Apply-PnPProvisioningTemplate

Connect to your content type hub site collection again:

Run Apply-PnPProvisioningTemplate with the updated XML file as input:
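
Again a minimal sketch with placeholder values:

    # Connect to the content type hub site collection again (placeholder URL)
    Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/contenttypehub" -Credentials (Get-Credential)

    # Apply the edited template; with -Handlers only the fields and content types are processed
    Apply-PnPProvisioningTemplate -Path "C:\Temp\ContentTypeHub.xml" -Handlers Fields,ContentTypes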

I changed the static name of "Document_x0020_Category" to "Document_Category", which is now reflected when viewing the field column URL.

This was a simple demonstration of the scripting tools available to manage site artefact changes that previously were difficult or impossible to make.

Changes can now be pushed out to all site collections by republishing the updated content type.

Using this same technique, with a bit more preparation, you can also extract a set of custom content types from one tenant and apply them to another, thereby keeping field names, content types and their internal GUIDs all intact!

Microsoft Forms preview: The ins & outs

Microsoft Forms was formally introduced via an Office Blog post, "Microsoft Forms—a new formative assessment and survey tool in Office 365 Education", and has been in preview since April 2016 for Office 365 Education subscribers. It allows users to create quizzes, questionnaires, assessments and subscription forms.

The product

Microsoft Forms is a product that specifically targets the education market and allows users to create web-based forms in which different types of questions can be created. The tool lends itself to creating a pop quiz for a classroom, a questionnaire to gather qualitative information about a topic, or a simple subscription form. As said, it is specifically targeted towards the education market and therefore only Office 365 Education licensed users will be able to use the product (in preview). Watch the video released by the product group below.

Microsoft aims to deliver an easy and fast solution for teachers to create assessments, which can be filled out via all types of browsers on all types of devices. Don't let the simple interface fool you; you do have powerful options available, such as validation, notifications and export to Excel.

Not the new InfoPath

If you are anything like me, your first reaction when hearing that there is something new called Microsoft Forms will most likely be: "Finally, the replacement product for InfoPath has arrived!". Well, it has not. Microsoft Forms does not come close to the full suite of options we know from InfoPath. And more importantly, there are no signs whatsoever from Microsoft that it is supposed to replace InfoPath in the future. For that we have to look at Microsoft PowerApps. Microsoft Forms is a product to create assessments, quizzes, surveys, etc. Let's show the power of the product by building a little quiz.

Pop quiz!

Let's jump into the product and create a pop quiz! It'll show off what the product is really good at: creating a questionnaire.

The Basics


So let's find out what it can do and take a closer look at this new addition to the Office 365 family. First off, Microsoft Forms is a legit product within the Office 365 suite for education licensed users, and therefore you start it, like any other product in the suite, from your app launcher.

Launching Microsoft Forms brings you to the My Forms overview, where all your Forms are shown, and a button to start creating a new one. My overview looks like this:


When you click the new button, a form builder is loaded which allows you to enter a title and an introduction text and start adding questions. There are five types of questions that can be added to a Microsoft Form.


First, we have the “choice” type, which allows you to define a question and list a set of options that can be the answer. Unique to the “choice” type is that you can use the “other” option if you want to provide a way for your form users to answer outside of the given options. This type of question is typically used for general information questions where there is no wrong answer, like: “how did you find out about this quiz?”.


Second, there is the “quiz” type which also works with defined answer options. Unique to this type is that there is actually a correct answer which can be set. Also you can provide feedback for each option to explain why an answer is correct or incorrect. The “quiz” type question is really the one that gives Microsoft Forms its educational flavor, because this is used to verify knowledge instead of gathering information.


Third, the “text” type for which the answer is given in a text box. Unique to this type is that there is an option to allow for a long answer, which gives the person taking the quiz a bigger textbox for the answer.


Fourth, we have the “rating” type, which allows you to answer using a scale. This scale can be set to stars or numbers and can run from 1 to 5 or from 1 to 10. The “rating” question is often used in questionnaires to gather information about how much the test subject agrees or disagrees with certain statements.


Fifth and last, there is the “date” type question, for which the answer is given by selecting a date from the calendar. A date type answer is often seen in subscription or application forms, or in questionnaires to ask about someone's birthday, for example. However, with a little creativity you can work this type of question into a quiz if the answer is a date (perfect for history exams), with a question like: “What was the founding date of Rapid Circle?”.


Advanced options

For each of the five types of questions you can indicate if a question is required or optional. This is almost common practice with any type of question tool, but since it is such a powerful way to ensure data completeness I did not want to let this go unmentioned.

Also, for all types of questions you have the option to add a subtitle. This could be used for providing a hint about the answer or giving guidance about how to answer the question.

“Choice” and “quiz” type questions can be turned from single answer questions into multiple answer questions with the flick of a switch. However, the way that Microsoft Forms lets the user know that multiple answers are possible is very subtle. For single answer questions the option selection boxes are round, and for multiple answer questions the option selection boxes are square. So when making a multiple answer question, I would definitely recommend putting something like “(multiple answers possible)” into the question. Otherwise you will surely get complaints from your quiz takers.

Also for “choice” and “quiz” type of questions you can select the setting to shuffle the options. This will present the answers in a different order every time the quiz is loaded, which has several advantages. One, and I know you are thinking the same, it makes it harder to cheat. Two, when looking beyond the possible bad behavior of quiz takers, there has been a lot of research on how the order in which options are presented influences the option that is most likely to be chosen by quiz takers or the most likely to be correct. So if you as a quiz creator want to remove this bias, shuffling the answers is a nice option that helps you.

For the “text” type question, it is possible to provide restrictions. For example that the answer should be a number (nice for math problems) or that the answer should be between two values. All the restriction options are number based restrictions, so they actually help you to turn the “text” type question into a sixth type of question, namely the “number” type.

For the form as whole there are also some additional settings that can be turned on or off. For example, you can choose if you want to apply a deadline or if you want to shuffle the questions.


Sending out the Quiz

When you are done creating the quiz there are several ways to send out word about your newly created quiz. Obviously you can share the link by copying and pasting it to a certain location or email the link.

But next to that, Microsoft shows a nice realization of their mobile-first strategy by allowing you to create a QR code for your quiz so people can scan it with their smartphone. Of course we did a test among colleagues, and it worked like a charm. This function is especially interesting when promoting a training or event for which users need to subscribe. On the poster or flyer you can easily include the QR code so people walking by can scan it and immediately subscribe.

The last way to offer your quiz to users is by embedding it onto a webpage. This could be a SharePoint page, but any other webpage will do as well.


When spreading the word about your quiz, questionnaire or subscription form, you can still control who can fill it out. While the options are not very extensive (to say the least), the most important choice is available, which is whether to allow people outside your organization to fill out the form.


Feedback to the User

When someone fills out your Microsoft Form, they get a piece of feedback after submitting. Next to the standard messages that thank the user for submitting and verify that the form was submitted successfully, extra feedback is given when “quiz” type questions are incorporated in your form.

First, as discussed, a “quiz” type question offers the option to provide a comment per answer option, which is shown after submitting the form. Second, in the advanced settings you can determine if the user should see the correct answer for a “quiz” type question after submitting. And third, a user score is calculated based on the number of “quiz” type questions they have answered correctly.

This last one is a bit tricky because it only looks at the “quiz” type questions in the form. So if you have a form with 8 questions and 4 of them are “quiz” type questions, then the maximum score a person can get based on the feedback is 4 out of 4. From a technology point of view it makes sense, because for the “choice”, “text”, “rating” and “date” type questions you cannot indicate what the correct answer is, so it just ignores those questions. But from a user experience point of view it is pretty weird if you just answered 8 questions and you see that your score is 3 out of 4. And since there is no option to switch off this feedback about the user's score, this definitely takes some communication effort to avoid confusion or complaints. So I would advise you to add a note covering this in the description text at the top of the form.

The responses

If you did a good job building and sharing your Microsoft Form, you will have plenty of responses in no time, which are automatically analyzed for you in the responses section of your form. Here you will find some statistics about the form as a whole and more detailed statistics about each individual question.


I have to say that the automatic statistics that are generated are quite good and cover the basic requirements around insight into your responses. But before we go into detail, I would like to point you to the “Open in Excel” button at the top right-hand side, which will allow you to completely go berserk in analyzing the responses in your own way.


For “choice” and “quiz” type questions the responses are presented in a table-like fashion as well as a chart. For “text” and “date” type questions the number of responses is presented along with the last three responses. And for “rating” type questions the number of responses is shown together with the average rating.

And for each question you have the option to click the “Details” button, which shows all the responses for that particular question in a dialog box.


Final thoughts on Microsoft Forms preview

Microsoft Forms is a very complete quiz tool that will help you to create quizzes, questionnaires and simple subscription forms in a quick and easy way. Especially for a product which still is in Preview, I have to say that this first version already covers a lot of requirements. However, there are two major points of critique when looking at Microsoft Forms.

First, the name. It is very misleading in the sense that it brings high expectations to anyone who knows about the fact that InfoPath will be leaving us in the future. Because if you review Microsoft Forms Preview from the perspective of it replacing InfoPath, then you will be very disappointed.

Second, the audience. Microsoft offers the preview exclusively to Office 365 Education licensed users, while this product can also be very helpful outside the educational realm. Many corporations, government bodies and non-profit organizations could use this product. Creating a quiz for your internal training programs, making a questionnaire for customer satisfaction research or building subscription forms for an event is daily business for any type of organization, and therefore the restriction to only offer this product to the educational market seems like a strange strategy. It even feels unfair to non-education licensed users. Logically, there are many, many people lobbying to bring Microsoft Forms to all Office 365 users when it becomes Generally Available, and I am one of them.

So Microsoft Forms is shaping up to be a promising tool for creating quizzes, questionnaires and subscription forms. It covers the basics and in 90 percent of cases will do just fine. But it is not the long-awaited replacement of InfoPath, so that remains on the wish list, and it will live a life in the shadows of the Office 365 suite if it remains solely targeted at education licensed users.

FAQ

How can I get Office Forms Preview?

Sign up to gain access to the preview via https://forms.office.com. Unfortunately it's only available right now for Office 365 Education and the US market. If you are outside the US but do have access to an Office 365 Education tenant, sign up, but fill out a US address.

Will it only be available for Education tenants?

At the moment it's only available for Education tenants. Microsoft is exploring all possibilities, but has nothing to share about that as of yet.

Will it be available in my region/language?

Yes, Microsoft Forms will be launched for all Office 365 Education regions and languages.

Is this the Infopath replacement?

You might think that when you read the product name, but... no, this isn't even close. Look at Microsoft PowerApps as the InfoPath replacement.

Is this the final product?

It's in preview with no live date set, so you may expect changes. These can be small and/or large. If you'd like you can contribute via the feedback button when you're using Office Forms or post your ideas and upvote others on the Office 365 uservoice (https://office365.uservoice.com/).

Is there a Microsoft Support article available?

Yes, use your favorite search engine or follow the link: Microsoft Support - What is Microsoft Forms?

The preview is available for the US right now. Anything I should be aware of when outside the US but still applying?

Yes, as it's running for US only right now, all data is stored in the Microsoft Data Centers in the US. So if you're in Europe for instance, the data entered in Microsoft Forms preview will be stored on US servers. This will be until the product becomes available for your region.

What will the future of development in SharePoint look like? Pretty much like it does now!

Managing the lifecycle of the components you build, deploy, operate and support on Office 365 is difficult. The platform is new, the cloud is new, and much of your classic DTAP model is simply no longer appropriate. We at Rapid Circle have been committed, for some time now, to being a Microsoft Cloud company rather than just a SharePoint company, and the recent developments around SharePoint announced at the Future of SharePoint event on May 4th continue to vindicate our stance.

Specifically, the announcement of an upcoming SharePoint Framework (SPFx) makes it clear that client-side development and the use of JavaScript are encouraged and promoted for custom solutions on Office 365!

Some key aspects relevant for developers and administrators that we found very interesting were:

  • The framework is a fully JavaScript-based model.
  • There is no single JavaScript framework mandated. We can still use popular frameworks such as Angular, Knockout and Handlebars.
  • Node.js & Gulp tasks are used for packaging and deploying components.
  • The local development model is going to be very different. A SharePoint Workbench is introduced where Gulp and Node.js will be used to host files locally, so you don't need to use IIS on your local machine (see the sketch after this list).
  • Visual Studio Code, an open source code editor, is being promoted as the preferred tool. This also clearly indicates that Visual Studio is not a must-have requirement for developers.
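
To make this concrete, here is a rough sketch of what that local, JavaScript-based workflow looks like with a Yeoman generator and Gulp tasks. The framework had not shipped at the time of writing, so the package and command names below are assumptions based on what Microsoft described and may differ in the released version:

    # One-time setup: Yeoman, Gulp and the SharePoint generator (assumed package name)
    npm install -g yo gulp @microsoft/generator-sharepoint

    # Scaffold a new client-side web part project
    yo @microsoft/sharepoint

    # Host the files locally and test in the SharePoint Workbench - no local IIS needed
    gulp serve

    # Bundle and package the solution for deployment
    gulp bundle
    gulp package-solution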

It's important to note that SPFx is not a radically new model. The framework might be new, but we've been doing client-side development for some years now using similar frameworks. Finally, though, Microsoft is creating a framework which leverages techniques we already use, such as CSOM and the REST APIs. Taking it a step further, Microsoft is openly embracing open source technologies such as Node.js, Gulp, Yeoman and more.

Our Rapid ALM tooling is very much aligned with what Microsoft described at the event. It's a development & delivery model that's entirely JavaScript based and enables us to streamline everything from the core development of our Instant Intranet components, integrated with our test teams, to deploying to our client tenants. Furthermore, it's built on and resides in the SharePoint Online platform.

Using Rapid ALM to evolve our Instant Intranet solution, we are also "early adopters" of Visual Studio Code as the default editor for our app development, where we use Angular 1.4 and RequireJS. Moreover, we utilize NPM and Gulp for building, packaging and deploying our Instant Intranet apps. Finally, we use Git for source code control & versioning.

Future of SharePoint Development model

With this development model, we have successfully created and set up intranets for customers in the first quarter of 2016. This new development model in SharePoint gives us the platform for Rapid and Instant service to our customers.

And we're looking forward to the upcoming SharePoint Framework and will continue to evolve & enhance our own software development lifecycle!

This blog post is part of the series Future of SharePoint. More on this topic can be found at http://08b.4d7.myftpupload.com/tag/FutureOfSharePoint/

A Preview of Microsoft Flow: the Pros and Cons

1. What’s Flow?

Microsoft Flow, announced during the Future of SharePoint keynote, is a new product that allows you to create cross-application action-reaction scenarios. Flow allows you to define what should happen if a certain event occurs, also known as the “If This, Then That” (IFTTT) [1] model. It is currently in preview, and at https://flow.microsoft.com you can sign up and get started on creating your own flows.

The whole premise is that you pinpoint the events that are important to you and decide what should happen if such an event occurs. To name just one example: let's imagine that you are a project manager for a top priority project and you want to keep a close ear to what people say about this project. You could create a Flow that keeps track of Yammer and creates an entry every time someone mentions your project on Yammer. With Flow this is possible.

2. So, basically the new workflows?

Officially Microsoft has not yet stated whether or not Flow is going to replace SharePoint Designer workflows. But as we all know, SharePoint Designer 2013 (because SharePoint Designer 2016 was an exact copy of SharePoint Designer 2013) was the last release of SharePoint Designer we can expect from Microsoft and with that we will say goodbye to workflows as well. This makes Flow the prime candidate to replace SharePoint Designer workflows. What I did find in the communication is that the product team will focus on “…adding ways to leverage your existing SharePoint workflows in Microsoft Flow.”[2]

I personally am a big fan of SharePoint Designer workflows, so when getting my hands on the Microsoft Flow Preview I could not help comparing it with the possibilities that SharePoint Designer workflows bring me.

3. What are the Pros?

Flow should not be simply seen as a newer version of workflows, because it is a whole other platform and approach to support “if this, then that” (IFTTT) [3] logic. So what makes Flow great?

Cross site workflows

SharePoint Designer workflows only operate within the borders of a site, so all lists and libraries that are involved in your logic process need to be in one site. With Flow it does not matter where your items are stored in SharePoint. It can trigger the creation of an announcement on the Management Team site because a deal was completed in the prospects list of the sales department site. It does not even matter if these sites reside in the same site collection or the same tenant.


Cross application workflows

The cross-platform approach of Flow even goes way beyond the borders of SharePoint by allowing you to tap into all sorts of different applications. If we go back to our previous example of the project manager keeping his ears open to the conversations on Yammer, we could expand that scenario by including other social platforms like Twitter and Facebook. So whenever the project is mentioned on Yammer, Twitter or Facebook, we record an entry in a SharePoint list. Or maybe we do not want to use a SharePoint list, but log everything in an Excel file so that we can crunch the numbers later and do some slicing and dicing analysis on the mood around this very important project.

With Flow, all the above-mentioned applications can be linked to each other and used for your IFTTT scenarios. Below you find an overview of the applications that are currently included in my preview version of Microsoft Flow.


Recurrence

Everybody who has experience with SharePoint Designer workflows will have had a firsthand taste of how difficult it is to add recurrence to your workflow. Starting a workflow on a certain date or time required making use of retention policies and making a workflow recur every hour, week or month meant pulling out a whole bag of tricks. But not anymore with Flow, because you get recurrence out of the box which allows you to run your Flow every day, hour, minute or even second!

Thinking of our example case, we could add a recurring step that sends out the mood analysis about the project every week.


User Profile Lookup

Flow allows you to look up a user profile as one of the actions. This can be your own profile, a specific user profile or the profile of a user based on a search. Even getting the profile of someone's manager is no problem, which is a real help in approval flows.

By looking up the profile, you also get access to the fields that are in the profile. And since you can add fields to the Office 365 profile of your users which make sense for your organization, this means that you can use those organization-specific profile fields in your Flow. Thinking back to our example, we could look up certain details about the people that mention the project, for example the department they work for or the country they are based in. This could increase our insight into which parts of the organization have the project on their agenda.


Templates

Where SharePoint Designer workflows came with a set of predefined templates that you could use straight out of the box, Flow goes the distance when you consider the number of templates that are offered[4]. Also, the set of templates is growing every day, because you can choose to share your self-created flow with the community.

4. What are the Cons?

As said, Flow is in preview and thus still being worked on. Therefore, it is expected to have some bits and pieces missing. So what are the things that are not so great about Flow?

Everything is personal

One of the first things I noticed is that, because Flow spans its logic across many applications, I was entering a lot of usernames and passwords to prove that I had the correct credentials for all these applications. Whenever you enter credentials, a connection is added to your Flow account in order to allow you to run the steps in your Flow that access that particular application. And you can manage these connections in your personal connections overview.


This, however, begs a very important question: is every Flow an impersonation workflow? And while I could not find a definite answer in the documentation [5], I cannot see any other conclusion than that a Flow you create only works based on the connections you have defined. This means that when you build a Flow that starts when an item is modified in a certain list, anyone who has permission to modify items in that list can trigger the Flow to start, and it will make use of your connections to go through all the defined steps. So any item that is modified, any item that is created, any email that is sent by the Flow is done using your credentials.

This is a very big deal and requires serious thought before activating a Flow and letting it run on a library or list. As highlighted in the introduction of this blog, Flow helps you to organize what should happen when events that matter to you occur, but it does so under your identity.

Start other flows

It is not possible to start a second Flow as an action of your primary Flow. While this option brought much happiness to users of SharePoint Designer 2013 workflows, it is not included in the set of actions in Flow. This is a pain, because it will again lead to the situation where every single variation and exception within your logic process has to be included in the same Flow. This makes building and maintaining Flows unnecessarily complex.

Reorder steps

I almost could not believe it, but I cannot rearrange the steps when building a flow. When you start building you have a first step and a plus sign beneath it. But after adding three or four steps, I could not squeeze another in between the ones I already had. This means that each time you want to go back and add a step in between others, you have to delete everything after that point first, add the new step, and then create the deleted steps again. I dare say that this is not just user unfriendly, it is just ridiculous.

And/Or conditions

In Flow you still rely on conditions and actions to create your Flow. But the options you have for formulating your conditions are greatly reduced compared to what you are used to in SharePoint Designer, where putting multiple conditions together immediately results in AND or OR logic. In Flow this is not that easy. If you put two conditions beneath each other you have to define actions in between, so there is no possibility to define an OR scenario. And AND scenarios are only possible by putting a whole series of conditions beneath each other and leaving out an action for the “If yes” path.


For conditions you do have the option to move into advanced mode within Flow, but that requires you to learn a new expression syntax to create your AND or OR condition. Oh, and when you choose to go into advanced mode, there is no turning back to the “not advanced” mode. Very user unfriendly, to say the least.


The verdict

Microsoft Flow really is one of the game-changing components of the Future of SharePoint announcement. While it is very tempting to compare it to SharePoint Designer workflows, it actually requires a whole different approach because of the cross-platform possibilities, deep integration with Office products and personalization.

The product on the one hand shows great potential but on the other hand still shows clear signs that it is merely in Preview. If I had to make up the score right now, I would already conclude that the good outweighs the bad, so I am very eager to see what will be added to Flow in the future.

I will definitely keep a close eye on the roadmap to see what is coming and advise everybody to do the same, because for me, Flow is absolutely part of the Future!

This blog post is part of the series Future of SharePoint. More on this topic can be found at http://08b.4d7.myftpupload.com/tag/FutureOfSharePoint/

Sources:

  1. IFTTT
  2. Flow Microsoft
  3. IFTTT
  4. Flow Microsoft Templates
  5. Flow Microsoft Documentation

Watch the keynote "The Future of SharePoint" here.

Lots of changes and innovations were announced at the 'Future of SharePoint' conference last month. We've watched the keynote and found it very interesting. We think you might find it interesting as well. So below, find the full keynote!

Full keynote presentation "Future of SharePoint"

Want to know more?

Read more in-depth information in other blogs/articles by myself and my colleagues: List of #FutureOfSharePoint posts

This blog post is part of the series Future of SharePoint. More on this topic can be found at http://08b.4d7.myftpupload.com/tag/FutureOfSharePoint/

Forms and SharePoint: Excellent question! 

The new kid in town

There is a brand new feature in Excel which allows you to make good-looking surveys super fast. This new addition to your favorite workbook tool is placed front and center in your OneDrive (in my case OneDrive for Business), and it will appear in your SharePoint document library as soon as you switch to the new look and feel (read more about the new experience in document libraries here). When selecting New, the Excel Survey option pops up in the drop-down, and as soon as you create one, you quickly enter a name for your document and then you are building your survey straight away.

How does it work?

When you create an Excel survey you basically create a workbook and a nice interface for data entry. This was always possible for the Excel experts among us, but it is readily available out of the box to all and way faster than building it yourself in an ordinary workbook. The survey builder lets you pick a title and a description (or delete the placeholder if your survey doesn’t need them) and then you start defining the questions.


Every question has an additional settings menu where you can define the question, the subtitle, the response type, whether it is required or not, and a default answer. And of course you will find the add and delete buttons in this panel, as you are used to from Microsoft. There is no validation (apart from making questions required) and no special formatting or anything; just plain and simple, and above all superfast, survey building. And when I say fast, I mean fast. The survey that I show below contains a title, a description and five questions, and was built in under ten minutes. And that was timed including the time to take screenshots of every step.

Once you are done with building your survey you simply hit the “Save and View” button to get a glance of how the survey will look to the people who you will be asking to take it. And if you like what you see you simply hit the “Share Survey” button and a link is generated that you can send to your audience.

What has happened in the background is that for each question you created, a column was made in your workbook. This is where the responses of the people taking your survey will be stored. And the columns have properties that match the kind of questions they are linked to. So a text column is made to store the answers for a text question, a number column is made for a number question and a choice column is made for a choice question. Again, I am not saying that this was not possible before in Excel, but this just makes it so much easier.

In the interest of being totally honest, I will add the note that Microsoft also includes on their support page, which is that "Columns in the spreadsheet are built as you add questions to the survey form. Changes you make to the survey form are updated in the spreadsheet, unless you delete a question or change the order of questions on the form. You'll have to update the spreadsheet manually in those cases: delete the columns that go with the questions you deleted, or cut and paste columns to change their order." (from: Office Support)

Why should I use Excel surveys?

For me this new feature really shows that Microsoft has a sense of what their customers are doing with their product. Because as stated earlier, building surveys in Excel has been done before. And also for other survey tools that Microsoft brought to life, for example the SharePoint Survey App, the data is stored in a list and usually analyzed in Excel. So building a survey feature into the product of Excel makes sense.

And there are some additional benefits next to the fact that your survey building time will become far shorter with this new feature. First, as explained, you only share a link with your audience. This means that you do not have to give the people who fill out the survey, access to the data in the workbook. This separation of data entry and data storage fits the security driven world we act in today. Second, since you share a link to a webpage with your audience, the survey is easily accessible from any device (desktop, laptop, tablet, mobile, etc.) as long as you have a browser and an internet connection. A colleague of mine opened the email I sent out on his phone and could fill in the survey straight away.

If I have to put the Excel survey in line with the other survey tools that Microsoft offers, I would put it as an equal weight and possible replacement of the SharePoint survey app. For the quick and dirty poll, you can use a third-party poll app or even the Outlook voting buttons, and for structured business processes you can use InfoPath forms or a third-party form builder like Nintex. But Excel surveys fit nicely in between, for scenarios where you do have multiple questions to ask but it is an ad hoc or one-time thing that doesn't need or justify developing a custom form.

This blog post is part of the series Forms and SharePoint. More on this Topic can be found HERE

Forms and SharePoint: InfoPath is here to stay


Forms? Why should I care?

For most people the word “forms” brings to mind bad memories of bureaucratic procedures with lengthy enrollment sheets that require you to enter data that you know for certain you have entered before. I know of these procedures, have been forced to go through a whole bunch of them, and I too can think of at least one form that made me give up before even starting the process it was for. But instead of yelling out “destroy, destroy!” whenever I come across a form, I am still a huge fan.

The reason for my everlasting love is simple: garbage in is garbage out for any process, and forms provide me with the weapon to ensure data quality at the point of entry. This focus on data quality was taught to me at a very early stage in my career in light of the single source of truth principle, along with the master data management mantra: “Create once, validate once, use many”. A line that is among my favorite quotes from the day I first heard it.

But hasn’t InfoPath left the building already?

Well actually, no. There has been a lot of talk all over the place, including official statements from Microsoft itself, that InfoPath is on the way out. And while these statements were very strong and decisive in 2014, when Microsoft first came forward with the topic, “…there will not be a new version of InfoPath and support will continue through April 2023…”[1], the 2016 release officially included InfoPath, which means that by the rules of the Lifecycle Support Model support is continued until 2021 (2026 with extended support)[2]. And Microsoft themselves have updated their earlier strongly formulated statement about InfoPath with a friendlier “…InfoPath Forms Services will be included in the next on-premises release of SharePoint Server 2016, as well as being fully supported in Office 365 until further notice. Customers will be able to confidently migrate to SharePoint Server 2016 knowing that their InfoPath forms will continue to work in their on-premises environments, as well as in Office 365.”[3]

And while Microsoft keeps saying that InfoPath 2013 will be the last version of InfoPath to be released, I see no immediate reason to move away from a tool that works and will continue to work in your latest on-premises and online environments.

So what is it good for?

To better understand my enthusiasm when it comes to forms, it is probably valuable to share that I am a process-minded guy with a strong background in Operational Excellence. And thus I know all too well that a good process does not allow for defects to be created along the way, because a defect either results in rework or scrap, and both are a waste of time.

I am such a big fan of forms, because they allow me to eliminate defects at the very beginning. And InfoPath gives me a big box of tricks to enforce data quality at the entry point. To list my favorite ones:

Data validation

We have all seen form entries which contradict themselves. Sometimes it is an honest mistake and sometimes it is unthinkable stupidity, but it is always annoying. With data validation you can add rules that check whether people are making mistakes while entering the form, for example by forcing that the return date of a reservation is later in time than the indicated starting date. And with the ability to include multiple fields in one rule, even the most complicated scenarios (if A is higher than B and C is less than D, then E cannot be higher than B plus C and not lower than D minus A) can be validated. And let's face it, we can all name a process that has these kinds of ifs, thens and buts.

Formatting

Who hasn't come across the quote “If you have answered ‘No’ to this question, please continue to question number...”? And of course, it is always nice to be able to skip a question, but wouldn't it be even better if the questions you do not need to answer are not even asked in the first place? With conditional formatting this is possible, because based on given values you can determine whether to show a field or even a whole section of fields. So only those questions that need to be asked based on the previous answers will be shown. This also gives the user the feeling that the form is designed especially for him or her, which always wins you sympathy points.

Querying for data

One other great way to enhance data quality and reduce effort for your users is the ability to query for data. The scenario that comes to mind is an ordering form, which always needs to contain information about a product and a customer. You know this information is already available somewhere in the organization, probably stored in a very structured manner, so why not tap into that source? By providing the product code in the form you can fire off a query that retrieves all the additional information about that product that you need. This has two big advantages: 1) as a user, I only need to type in the product code, and 2) as an administrator, I only have to keep the source list up to date to ensure data quality for every entered product code.
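In InfoPath itself this lookup runs through a data connection to the source list. Purely as an illustration of what such a lookup does behind the scenes, here is a minimal CSOM sketch against a hypothetical “Products” list with hypothetical “ProductCode” and “Description” columns (the URL and account are placeholders):

[code language="powershell"]
# Load the SharePoint client assemblies - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$SiteUrl     = "https://DOMAIN.sharepoint.com/SUBSITE"   # placeholder
$UserName    = "USER@DOMAIN.com"                         # placeholder
$ProductCode = "P-1001"                                  # the only value the user would type

$SecurePassword = Read-Host -Prompt "Password" -AsSecureString
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$Context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)

# Query the source list for the item that matches the entered product code
$List  = $Context.Web.Lists.GetByTitle("Products")
$Query = New-Object Microsoft.SharePoint.Client.CamlQuery
$Query.ViewXml = "<View><Query><Where><Eq><FieldRef Name='ProductCode'/><Value Type='Text'>$ProductCode</Value></Eq></Where></Query></View>"
$Items = $List.GetItems($Query)
$Context.Load($Items)
$Context.ExecuteQuery()

# Show the additional product information retrieved from the single source
foreach ($Item in $Items) {
    Write-Host "Description:" $Item["Description"]
}
[/code]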

Cascaded dropdowns

As a user, I always like it when a form thinks along with me. Take for example the scenario where I have to provide a shipping location by filling out country, city and building. As soon as I have provided the country, I would appreciate only being able to pick cities within that country. And as soon as I have picked the city, I only get buildings within that city. With cascading dropdowns this is possible, and again it is a way to enhance data quality and provide user comfort at the same time.

Calculated fields

Calculated fields can be of use in two different ways. The first one is obvious because it is right there in the name: to perform a calculation. This can be anything from a simple total order amount calculated from the quantity and the price, to a complex calculation of break-even revenue based on the price, sales per hour, salary per hour, rent per day and overhead cost percentage. Incorporating any formula into your form again enhances data quality and provides user comfort.

The second way calculated fields can help you in InfoPath is by providing feedback to the user. Take the previous scenario of retrieving data about a product based on the product code. It is not a bad idea to show the user all the information that was retrieved for the entered product code, so he or she can check whether that is indeed the product information that is needed. However, you also do not want to allow the user to change this info: if there really is a fault in the data, the source needs to be adjusted to solve it for everyone. The nice thing about calculated fields, as you also know them from SharePoint lists, is that they cannot be edited. So if we present the retrieved info in calculated fields, the user gets their feedback and the admin maintains control over the data quality. Again a win-win scenario.


So what’s the downside?

Of course InfoPath is not perfect and there are limits to what you can do with it, but in my experience the vast majority of business processes can be greatly streamlined by pouring the data entry into smart forms built in InfoPath. And if you come across that one process or trick that just cannot be made dummy-proof with a good form, then please ask yourself this question before going out and buying something else: “does my process really need to be this complex?”. But that’s the Operational Excellence guy in me talking.

One true disadvantage of InfoPath is that it is not included in the standard SharePoint Server offering. You have to go Enterprise[4] to get it, and that is a pretty big difference in cost. And while everything has its price tag, you will need to sit down and do your homework before acquiring InfoPath from an investment point of view. It will take a cost-benefit analysis over multiple areas, weighing efficiency gains against license costs, before you know whether the investment of upgrading from a Standard Server license to an Enterprise Server license will pay off.

However, the Office 365 environment offers multiple possibilities to go about your licensing in a whole different way that could significantly reduce the total investment needed. Plus, you get the flexibility of scaling up and down in the cloud. Our licensing experts can certainly help you figure out what would be the best plan for your organization with respect to InfoPath licensing.

Another often heard critique about forms is that they are ugly and that you cannot do a lot in terms of design. While InfoPath is certainly built with function over beauty in mind, there are still many possibilities to enhance the look of your forms. I would put it like this: if you can manage to create nice Excel sheets and Word documents, you have the tools to present a good-looking form.

To make a long story short

InfoPath is definitely not out the door yet. As Microsoft promises its users, it will be included in the latest online and on-premises offerings for SharePoint and supported through 2026. So if you have InfoPath at your fingertips right now, use it! Build those forms and make your processes more robust and foolproof. The investment will pay itself back in the coming decade, and the experience you gain from digging into the details of your processes, determining what piece of information is needed when and what the best source is to retrieve it from, will be valuable forever. Because you will need to go through the same steps when building your forms in any other tool.

 

This blog post is part of the series Forms and SharePoint. More on this Topic can be found at http://08b.4d7.myftpupload.com/tag/FormsAndSharePoint/

Website based on SharePoint live at BrabantZorg

Last month, Rapid Circle launched a public website based on SharePoint at BrabantZorg. The website is fully responsive, optimised for Google and fully up to date.

What did BrabantZorg need?

BrabantZorg is – as a committed organisation for housing, welfare and healthcare – always working on renewing itself. The goal is to be a distinctive and attractive regional provider for clients, employees and all the relevant institutions in the changing healthcare market. An important step in the development towards becoming that attractive provider was a new website.


What did we do?

In collaboration with the online marketing bureau Have a Nice Day Online and BrabantZorg, we built a public-facing website based on SharePoint 2013. Because the internal sites and processes of BrabantZorg also run on SharePoint 2013, it suited BrabantZorg to have the public website built on the same technology, keeping the future and the possibility to integrate processes in mind. Rapid Circle took care of the technical realisation and the project management.


The result

The website, developed in co-production, has improved a lot compared to the old version. The current site is fully responsive, optimized for Google and entirely compliant with current guidelines and standards. The most important difference between the old and the new site of BrabantZorg is that the user experience is much more positive due to the improved user-friendliness.

Would you like to know more about public-facing websites based on SharePoint, or about what else we can do? Please contact us!

Powershell: Publishing all files in a SharePoint Online library programmatically

One of our clients built up a library of 500+ documents. After these were modified (metadata was added and the content went through several rounds of corrections), we were asked to mass publish all files so the site could go live. That left us with 2 options: 1) manually check in, publish and approve all files, or 2) put some CSOM and PowerShell together in a file and do it programmatically. Of course, I, Mark Overdijk, chose to pursue the second option. I asked Massimo Prota to assist in getting a script ready. The first version of the script turned out rather useful, so I added some extra features and more output to and interaction with the user. This latest version is generic enough that it can be re-used.

Features

  • No limitation on the number of files in a list/library
  • Code included to filter which files should be published
  • The user will be prompted for a password and a confirmation
  • Feedback to the user on screen
  • If a file is checked out, the script will check it in before proceeding
  • If Content Approval is enabled for the list/library, the script will approve the file
  • Screen output will be saved to a txt file which includes the current date/time in the filename

Prerequisites: PowerShell

Step 1. Gather parameters

For the script to run properly, you'll need the following parameters:

  • SiteUrl: the full URL to the (sub)site where the list is stored for which you want to publish/approve the files
  • ListName: the Title of the list for which you want to publish/approve the files
  • UserName: a user account that has enough permissions to publish/approve the files

Step 2. Run PowerShell script

Start Windows PowerShell as administrator.

Be sure to first set the ExecutionPolicy correctly so you are able to run scripts: run Set-ExecutionPolicy Unrestricted, press [ENTER] and input "A" for Yes to All. After the ExecutionPolicy is set, we can run the script file.
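For reference, those two steps look roughly like this (the location of the .ps1 file is just an example, point it at wherever you saved the script below):

[code language="powershell"]
# Run these in the elevated Windows PowerShell window
Set-ExecutionPolicy Unrestricted     # answer "A" (Yes to All) when prompted
.\PublishFilesSPO.ps1                # example: script saved in the current folder
[/code]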

[code language="powershell"]
####################################
# Script: PublishFilesSPO.ps1      #
# Version: 2.0                     #
# Rapid Circle (c) 2016            #
# by Mark Overdijk & Massimo Prota #
####################################

# Clear the screen
Clear-Host

# Add Wave16 references to SharePoint client assemblies - required for CSOM
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

# Parameters
# Specify the subsite URL where the list/library resides
$SiteUrl = "https://DOMAIN.sharepoint.com/SUBSITE"
# Title of the List/Library
$ListName = "TITLE"
# Username with sufficient publish/approve permissions
$UserName = "USER@DOMAIN.com"
# User will be prompted for password

# Set Transcript file name
$Now = Get-Date -UFormat %Y%m%d_%H%M%S
$File = "PublishFilesSPO_$Now.txt"
# Start Transcript
Start-Transcript -Path $File | Out-Null

# Display the data to the user
Write-Host "/// Values entered for use in script ///" -ForegroundColor Cyan
Write-Host "Site: " -ForegroundColor White -NoNewline; Write-Host $SiteUrl -ForegroundColor Green
Write-Host "List name: " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green
Write-Host "Useraccount: " -ForegroundColor White -NoNewline; Write-Host $UserName -ForegroundColor Green
# Prompt user for password
$SecurePassword = Read-Host -Prompt "Password" -AsSecureString
Write-Host "All files in " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green -NoNewline; Write-Host " on site " -ForegroundColor White -NoNewline; Write-Host $SiteUrl -ForegroundColor Green -NoNewline; Write-Host " will be published by UserName " -ForegroundColor White -NoNewline; Write-Host $UserName -ForegroundColor Green
Write-Host " "

# Prompt to confirm
Write-Host "Are these values correct? (Y/N) " -ForegroundColor Yellow -NoNewline; $confirmation = Read-Host

# Run script when user confirms
if ($confirmation -eq 'y') {

    # Bind to site collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
    $Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
    $Context.Credentials = $Credentials

    # Bind to list and query for all items
    $List = $Context.Web.Lists.GetByTitle($ListName)
    $collListItem = $List.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
    $Context.Load($List)
    $Context.Load($collListItem)
    $Context.ExecuteQuery()

    # Go through the process for all items
    foreach ($ListItem in $collListItem){

        # Adding spacer
        Write-Host " "
        Write-Host "/////////////////////////////////////////////////////////////"
        Write-Host " "
        # Write the Item ID, the FileName and the Modified date for each item that will be published
        Write-Host "Working on file: " -ForegroundColor Yellow -NoNewline; Write-Host $ListItem.Id, $ListItem["FileLeafRef"], $ListItem["Modified"]

        # Un-comment the "if" below when you want to add a filter for which files will be published.
        # Fill out the details of which files should be skipped. The example skips all files last modified before 31-jan-2015.
        # if ($ListItem["Modified"] -lt "01/31/2015 00:00:00 AM"){
        #     Write-Host "This item was last modified before January 31st 2015" -ForegroundColor Red
        #     Write-Host "Skip file" -ForegroundColor Red
        #     continue
        # }

        # Check if the file is checked out by checking if the "Checked Out To" column is not empty
        if ($ListItem["CheckoutUser"] -ne $null){
            # Item is checked out, so the check-in process is applied
            Write-Host "File: " $ListItem["FileLeafRef"] "is checked out." -ForegroundColor Cyan
            $ListItem.File.CheckIn("Auto check-in by PowerShell script", [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
            Write-Host "- File Checked in" -ForegroundColor Green
        }

        # Publishing the file
        Write-Host "Publishing file:" $ListItem["FileLeafRef"] -ForegroundColor Cyan
        $ListItem.File.Publish("Auto publish by PowerShell script")
        Write-Host "- File Published" -ForegroundColor Green

        # If Content Approval is enabled for the list/library, approve the file when its "Approval status" column does not equal "0" (= Approved)
        if ($List.EnableModeration -eq $true){
            if ($ListItem["_ModerationStatus"] -ne '0'){
                # File is not approved, approval process is applied
                Write-Host "File:" $ListItem["FileLeafRef"] "needs approval" -ForegroundColor Cyan
                $ListItem.File.Approve("Auto approval by PowerShell script")
                Write-Host "- File Approved" -ForegroundColor Green
            }
            else {
                Write-Host "- File has already been Approved" -ForegroundColor Green
            }
        }
        $Context.Load($ListItem)
        $Context.ExecuteQuery()
    }

    # Adding footer
    Write-Host " "
    Write-Host "/////////////////////////////////////////////////////////////"
    Write-Host " "
    Write-Host "Script is done" -ForegroundColor Green
    Write-Host "Files have been published/approved" -ForegroundColor Green
    Write-Host "Thank you for using PublishFilesSPO.ps1 by Rapid Circle" -ForegroundColor Cyan
    Write-Host " "
}
# Stop script when the user doesn't confirm
else {
    Write-Host " "
    Write-Host "Script cancelled by user" -ForegroundColor Red
    Write-Host " "
}
Stop-Transcript | Out-Null
##################################
# Rapid Circle                   #
# http://08b.4d7.myftpupload.com #
##################################
[/code]

Search Result Source: Why you should use them more often

SharePoint 2013 encompasses a very fancy query configurator and "builder" in the search results web part. An admin can configure the query to filter down and/or scope the results the search web part will show. This is fantastic news as you no longer need to write SQL to provide a simple solution for client requirements like:

  • "search results should only show images"
  • "search results should display results from HR department sub site"

 

Why a result source?

The biggest issue I have with this solution is that it is almost entirely based on the "Local SharePoint Results" result source, which means that the search results web part will load the complete index and then apply the query rules.

 

I wonder, however, why we put such a load on the system by loading the full index and then telling it that we only need 2% of the results or just the *.PSD files. We can speed things up by configuring a result source, so the system only gets the results from, for example, the *.PSD files. This results in a performance increase because both the result source and the applied query have faster loading times. This is much appreciated considering we're applying more and more front-end design to display templates: whenever we can save load time, we should do so.
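To make that concrete, a result source is essentially a pre-scoped query template. Two made-up examples matching the scenarios above (the {searchTerms} token stands for whatever the user types; the URL is fictitious):

[code]
{searchTerms} fileextension:psd
{searchTerms} path:"https://tenant.sharepoint.com/sites/HR"
[/code]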

 

Another benefit is that if we do want to add extra filters - for example when we have a result source for one site but two search result pages (no. 1 = all results and no. 2 = documents only) - the result source can be applied to both pages. This results in an even better performance on the second page, because the performance increase is applied twice: the loaded index is smaller and the filter is applied to a smaller result set.

 

Besides all this, there are also functional benefits to using result sources. First of all, result sources can be applied to all result web parts within the site collection in which they were created. So, when a more experienced admin creates a result source within a site collection, other, less experienced admins within that same site collection can apply the result source to their result web parts.

 

A result source will also help with version control and updates. When you configure each search results web part completely within the web part itself, the configuration is fragmented and can't be updated from a single source or be reused. This is not the case with result sources: these can be updated at the source, and the source can be a Site, a Site Collection or a Web application/Tenant.

 

So to conclude: result sources improve the performance of search results web parts, and they are scalable and manageable.

When to use result sources?

In theory, a result source can be used for every search results web part. It would become a mess, however, if you created a result source for every detailed search. A simple guideline is: use a result source when the scope is smaller than a site collection, when more than three site collections (including my site/search center) are present, or when a filter is applied. What does that mean exactly? A few examples:

 

Result Source:

  • All results from site HR including sub sites.
  • All video files from site collection "Video".

Full index:

  • All results.
  • All results from a site collection (total no. of site collections is 3).

 

Next time

In the next post I'll share the details on how to configure a result source.

For now, I would like to know whether you use result sources, or why you disagree with using them.

Rapid Circle wins again! Cloud Partner of the Year

After winning the worldwide ‘Health Partner of the Year’ award from Microsoft, we also received the ‘Cloud Partner of the Year’ award for the Dutch healthcare sector. Andre Piso, Health & Local Government Lead at Microsoft Netherlands, handed over the award to Wilco Turnhout. With these annual awards Microsoft wants to express its appreciation for all Dutch partners who have developed special or innovative solutions for the healthcare industry based on Microsoft technology. The Microsoft Health Partner awards are handed out in 5 categories: Innovator of the Year, System Integrator of the Year, Solution Partner of the Year, Licensing Partner of the Year and Cloud Partner of the Year. We also want to congratulate the other award winners – Wortell, Caase.com, ChipSoft and Agile Software.

Rapid Circle won this ‘Cloud Partner of the Year’ award because of our outspoken choice for Office 365, the cloud and Microsoft Azure: a clear ‘Cloud first’ strategy at our customers, but also in our own marketing and PR.

Walkthrough: Add Geolocation column to your list in Office 365

A while ago a client (with an Office 365 E3 subscription) came to us with the wish to create a map on which to plot the locations of external contractors. My first thoughts, as an Office 365 consultant, went towards using the tools at hand: SharePoint 2013/Online has a Geolocation column type and the "Map view" list view type. The client agreed to use this feature and I went about setting up the solution, posing the self-fulfilling prophecy: "How hard can it be?"...

As the list with the data was already in place, I was not keen on having a developer create a solution that either creates a new list with the column in it (leaving me to migrate the data) or adds the column programmatically as a one-off. I wanted to add the column directly through a (reusable) script and went on to do my desk research. This ended up taking way too much time, as almost all the information I found…

  • ...were solutions for SharePoint 2013 on premise,
  • ...were articles on the end-result,
  • ...posted failing scripts,
  • ...did not offer information on the Bing Maps key,
  • ...did not offer guides/information specifically for Office 365/SharePoint Online scenarios.

Something as simple as "what to use as the Bing Maps application URL for an Office 365 tenant?" was not to be found.

It took a while, but when I finally got the settings right for a Bing Maps key and had a working script, I decided on 2 things:

  1. Create a generic script, because as a consultant I'll want to use this script more than once for multiple tenants.
  2. Write a blog post as a definitive guide to adding the geolocation column type in Office 365/SharePoint Online, as a resource for the community.

Scenario

For the walkthrough I'm using the following scenario: as the global admin for the tenant https://yourcompany.sharepoint.com, I'm adding the geolocation column type to the list "Contact" on the sub site https://yourcompany.sharepoint.com/sites/sales and naming the column "Office".

Step 1. Get a Bing Maps Key

Go to the Bing Maps Dev Center: https://www.bingmapsportal.com/

Log in with your Live account (@live.com, @outlook.com, etcetera) or create one to gain access.

Go to My account > Create or view keys


To create a new API key follow the "Click here to create a new key" hyperlink


Fill out the form to create your API key:

  • Application name: the name you would like to use for your key. It helps you identify the key in your overview.
  • Application URL: the URL of your root SharePoint portal (https://tenant.sharepoint.com).
  • Key type (Trial/Basic): choose whether you're using the key for 1) a test site (max 10,000 calls p/mth and a max 90-day trial period) or 2) a live site (free for max 125,000 calls p/yr) (more info here).
  • Application type: what is the application? App, site, non-profit use, etc.

In this scenario, the admin fills out the form as follows:

  • Application name: Sales Office
  • Application URL: https://yourcompany.sharepoint.com
  • Key type: Basic
  • Application Type: Public Website

After you click Create and fill out the Captcha correctly, the page refreshes and displays your new key below. You'll receive a 64-character key.


Step 2. Gather required information

For the script to run properly, you'll need the following information:

  • Site URL: URL to the site where the list is.
  • Login account: an account with at least admin permissions, as you're changing list settings.
  • List Name: name of the list to add the geolocation column to.
  • Column Name: title of the geolocation column.
  • Bing Maps Key: to register the app and remove the notification in the map view.

In this example, the admin has gathered the following info:

  • Site URL: https://yourcompany.sharepoint.com/sites/sales
  • Creds: Admin@YourCompany.onmicrosoft.com
  • List Name: Contact
  • Column Name: Office
  • Bing Maps Key: [PASTE KEY HERE]

Now we can run the script.

Step 3. Run script

Start SharePoint Online Management Shell as administrator

If you don't have the SharePoint Online Management Shell, you can download it from the Microsoft Download Center.

[code language="powershell"]
Set-ExecutionPolicy Unrestricted
Clear-Host
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic')

<# Get user input #>
$SiteURL = [Microsoft.VisualBasic.Interaction]::InputBox("Enter Site URL, example: https://yourtenant.sharepoint.com/sites/yoursite", "URL", "")
$Login = [Microsoft.VisualBasic.Interaction]::InputBox("Office 365 Username, example: youradmin@yourtenant.onmicrosoft.com", "Username", "")
$ListName = [Microsoft.VisualBasic.Interaction]::InputBox("List name to add Geolocation column", "ListName", "")
$ColumnName = [Microsoft.VisualBasic.Interaction]::InputBox("Column name for the Geolocation column", "ColumnName", "")
$BingMapsKey = [Microsoft.VisualBasic.Interaction]::InputBox("Bing Maps key", "Key", "")

<# Show results #>
Write-Host "/// Values entered for use in script ///" -ForegroundColor Magenta
Write-Host "Site: " -ForegroundColor White -NoNewline; Write-Host $SiteURL -ForegroundColor Green
Write-Host "Useraccount: " -ForegroundColor White -NoNewline; Write-Host $Login -ForegroundColor Green
Write-Host "List name: " -ForegroundColor White -NoNewline; Write-Host $ListName -ForegroundColor Green
Write-Host "Geolocation column name: " -ForegroundColor White -NoNewline; Write-Host $ColumnName -ForegroundColor Green
Write-Host "Bing Maps key: " -ForegroundColor White -NoNewline; Write-Host $BingMapsKey -ForegroundColor Green
Write-Host " "

<# Confirm before proceeding #>
Write-Host "Are these values correct? (Y/N) " -ForegroundColor Yellow -NoNewline; $confirmation = Read-Host
if ($confirmation -eq 'y') {
    $WebUrl = $SiteURL
    $EmailAddress = $Login
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($WebUrl)
    $Credentials = Get-Credential -UserName $EmailAddress -Message "Please enter your Office 365 Password"
    $Context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($EmailAddress, $Credentials.Password)

    # Add the Geolocation field to the list and to the default view
    $List = $Context.Web.Lists.GetByTitle("$ListName")
    $FieldXml = "<Field Type='Geolocation' DisplayName='$ColumnName'/>"
    $Option = [Microsoft.SharePoint.Client.AddFieldOptions]::AddFieldToDefaultView
    $List.Fields.AddFieldAsXml($FieldXml, $true, $Option)
    $Context.Load($List)
    $Context.ExecuteQuery()

    # Store the Bing Maps key on the web so the map view does not show the key notification
    $Web = $Context.Web
    $Web.AllProperties["BING_MAPS_KEY"] = $BingMapsKey
    $Web.Update()
    $Context.ExecuteQuery()
    $Context.Dispose()

    Write-Host " "
    Write-Host "Done!" -ForegroundColor Green
    Write-Host " "
}
else {
    Write-Host " "
    Write-Host "Script cancelled" -ForegroundColor Red
    Write-Host " "
}
[/code]
The actual programming part of the script I adapted from the script posted in a blog post by Albert Hoitingh. I wanted to remove the hardcoded values from the code, so the script can be run based on user input. So I added the interface (input boxes, confirmation, Write-Hosts), replaced the hard-coded values and added comments.

When you run the script, PowerShell will ask the user to input the information we gathered in Step 2.


After the last value has been entered, the admin will see a confirmation screen where the values can be reviewed and confirmed (if the input is incorrect, the script can be cancelled by entering "N").


After confirmation, the admin will be prompted to enter the password.


If everything was filled out correctly, the script will run and return the "Done!" notification upon completion.


Return to your SharePoint Online list and you'll notice that, when creating a new view for the "Contact" list, you now have the Map View option. When checking the list settings, you'll see that the column "Office" of the type Geolocation has been added.
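If you prefer to double-check the result from the same SharePoint Online Management Shell instead of the browser, a minimal sketch like the one below (reusing the example values from this walkthrough) lists the new field and reads back the stored key:

[code language="powershell"]
# Verification sketch: confirm the Geolocation column exists and the Bing Maps key is stored on the web
$SiteURL = "https://yourcompany.sharepoint.com/sites/sales"
$Login   = "Admin@YourCompany.onmicrosoft.com"
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$Creds   = Get-Credential -UserName $Login -Message "Please enter your Office 365 Password"
$Context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Login, $Creds.Password)

$List = $Context.Web.Lists.GetByTitle("Contact")
$Context.Load($List.Fields)
$Web = $Context.Web
$Context.Load($Web.AllProperties)
$Context.ExecuteQuery()

# The new column should show up with its type
$List.Fields | Where-Object { $_.Title -eq "Office" } | Select-Object Title, TypeAsString
# The Bing Maps key stored on the web
$Web.AllProperties["BING_MAPS_KEY"]
[/code]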


Are you missing information, do you want me to clarify anything, do you want to post a conversation starter or do you just want to say thanks? Leave a comment.

Content Type Hub "Lite" in Office 365

Content Type Hubs, or centrally managed content types, were one of the biggest (and most under-appreciated) additions to SharePoint 2010. When working with several web applications and site collections (which is inevitable most of the time), keeping content types up to date across several locations can become a drag and is easy to get wrong. So Microsoft added the possibility of a centrally managed hub that site collections and web applications can subscribe to while still having local content types.

As Office 365 is based on the SharePoint 2010 framework, the Content Type Hub is available to be activated as a site collection feature. But beware! On a SharePoint 2010 server, we can activate the feature for any web application/site collection. This way you can define one or more web applications/site collections as a content type hub and name them appropriately. This cannot be done in Office 365. You receive one Content Type Hub, which is by default https://[account].sharepoint.com/sites/contenttypehub, and this is "by design" (in case you were wondering). Yes, you have the option in all web applications and site collections to activate the content type hub feature, but it will not work! You will find this out only after you're done setting it all up (Technet: Configuring the Content Type Hub), creating the content types and getting ready to start the sync. The moment you start setting up subscriptions, you will find out that you can only get one subscription, which is the default /sites/contenttypehub.

In short, you receive the Lite version of the Content Type Hub in Office 365. So when planning your Office 365 infrastructure, make sure you plan for only one Content Type Hub and that it is available to you only at the default location. I found out the hard way, so you don't have to :)

 

Create publishing pages with PowerShell in SharePoint 2010

For a client we created an extended PowerShell script to do a One-Click deployment of their intranet. This was awesome to create, because it really saved huge amounts of time creating sites and site collections, adding and activating solutions, and setting permissions, navigation and master pages. But we still had to add a piece of code to create pages in the Pages Library. After a quick search, I came across this blog post by Brendan Newell. It helped me out greatly! It just missed one crucial element: setting the page title. So I added this to his script. They were minor changes, so all credits go to Brendan.

PowerShell script:

[code language="powershell"]
# Read in list of pages from XML
[xml]$pagesXML = Get-Content "pages.xml"
if ($pagesXML -eq $null) { return }

# Get publishing web
$site = New-Object Microsoft.SharePoint.SPSite($server1)
$web = $site.rootweb
$pWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)

# Loop through each page node to extract filename and title
$pagesXML.Pages.Page | ForEach-Object {
    $fileName = [string]$_.FileName
    Write-Host "Creating $fileName"
    $titleName = [string]$_.TitleName
    Write-Host "Creating $titleName"

    # Create blank page
    $newPage = $pWeb.AddPublishingPage()
    $newPage.Update()

    # Update the filename and title to the ones specified in the XML
    $newPage.ListItem["BaseName"] = $fileName
    $newPage.ListItem["Title"] = $titleName
    $newPage.ListItem.SystemUpdate()

    # Check-in and publish page
    $newPage.CheckIn("")
    $newPage.ListItem.File.Publish("")
}

# Dispose of the web
$web.Dispose()
[/code]

XML structure:

[code language="xml"]
<?xml version="1.0" encoding="utf-8"?>
<Pages>
  <Page>
    <FileName>P1</FileName>
    <TitleName>Page One</TitleName>
  </Page>
  <Page>
    <FileName>P2</FileName>
    <TitleName>Page Two</TitleName>
  </Page>
</Pages>
[/code]
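As a usage note: the snippet expects $server1 to hold the site URL and pages.xml to sit in the working directory. A hypothetical way of running it from the SharePoint 2010 Management Shell could look like this (the URL, folder and file name are made up, not from the original post):

[code language="powershell"]
# Hypothetical example; adjust names and paths to your own environment
$server1 = "http://intranet.contoso.local"   # example site URL
Set-Location "C:\Deploy"                     # hypothetical folder containing pages.xml and the script
.\CreatePublishingPages.ps1                  # hypothetical file name for the script above
[/code]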

List view Lookup threshold in Office365

I created a list which uses 12 Person or Group fields, so I changed the resource throttling in Manage Web Applications > General Settings in Central Admin. Customer happy, so I'm happy too. Now we have to move this list to Office365 because their intranet is moving to the cloud. But changing the resource throttling setting is not possible in Office365 (yet?). So now we have to create a new solution.
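For reference, that on-premises change can also be scripted instead of clicking through Central Admin. A minimal sketch, assuming the SPWebApplication.MaxQueryLookupFields property behind the List View Lookup Threshold setting (the URL is an example):

[code language="powershell"]
# On-premises (SharePoint 2010) only - this is exactly what you cannot do in Office365
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$webApp = Get-SPWebApplication "http://intranet.contoso.local"   # example URL
$webApp.MaxQueryLookupFields = 12    # List View Lookup Threshold, default is 8
$webApp.Update()
[/code]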

So keep in mind when working with Office365, you can't change the list view lookup threshold.

*UPDATE*: waiting for a reaction from Microsoft on whether the setting will be made available in the future.

*UPDATE 20110829*: Microsoft Customer Support has confirmed that the setting is not available as of yet, but this may change in the future...