
Wild new Roslyn: Who needs their own code analyzers and C# scripting?

by admin

All is well in the .NET world: the platform is moving in the right direction, and new technologies are being tested and getting up to speed. There has been a lot of talk lately about .NET/ASP.NET Core, and everyone seems to have forgotten about Roslyn, which provides rich, well-documented code-handling capabilities both at runtime and during development.
To fix this, we interviewed Filip W, Microsoft MVP, Roslyn contributor, and one of the world's most popular ASP.NET bloggers. Why does Filip think the changes in the new C# may go unnoticed, why write your own code analyzers, and why is scripting in C# better than any scripting language?
JUG.ru Group: Filip, let’s start with a warm-up. ASP.NET Core is changing a lot right now. As a developer working with the platform, how do you feel about the changes taking place?
Filip W: Of course, the early adopters experienced a lot of problems: delays, versioning hell, name changes, tooling problems, an ever-changing project system, and more, up to and including changes to the .NET Standard concept itself. Could it have been done better? Definitely yes, although in hindsight everything always looks simple and straightforward.
On the whole, the changes are definitely for the better. Just think: a few years ago ASP.NET ran only on Windows and only on IIS. Plus, it was built on System.Web.dll, which added ridiculous overhead (29 KB on average) to every HTTP request. Today, if you believe the benchmarks, ASP.NET Core is among the top three to five best-performing web frameworks on Linux. From that point of view, the platform has undoubtedly undergone an incredible transformation.
JUG.ru Group: So you’ve written .NET applications for Linux? How stable are such solutions? Is the platform ready for production?
Filip W: Yes, many of my new projects now run in Docker on Debian-based systems. I’ve run into some problems with platform-dependent code, like cryptography, or strange deadlocks here and there, but overall I’m happy with it. And of course, the benefits of being able to control the entire platform via Docker Swarm are impressive.
In fact, we try to develop cross-platform .NET code regardless of which OS the project will be deployed on. As a result, most of our projects have build agents for Windows, macOS, and Linux. This way, I can develop something on my Mac and then deploy it to Docker or an Azure Web App, confident that everything will work correctly.
JUG.ru Group: What about C#? Version 7.0 brings tuples, pattern matching, and many other features. Will these new features be useful for you as a developer?
Filip W: As with C# 6.0, the changes are minor, so many developers probably won’t notice them in their day-to-day work. Personally, I definitely find the new tuples useful, because the way tuples are currently implemented is pretty bad. Hopefully this will drastically reduce the number of auxiliary classes we encounter now, where a developer has to create a whole new class just to deserialize or read one or two fields from the database.
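As a minimal sketch of the C# 7 tuple syntax Filip is referring to (the method, fields, and values below are invented for illustration, not taken from the interview):

```csharp
// Returning two values without declaring a throwaway helper class.
using System;

class CustomerLookup
{
    // Before C# 7 this would typically require a small auxiliary class or System.Tuple.
    static (int Id, string Name) ReadCustomer()
    {
        // Imagine these values coming from a database row.
        return (42, "Ada");
    }

    static void Main()
    {
        var (id, name) = ReadCustomer();    // deconstruction into locals
        Console.WriteLine($"{id}: {name}"); // prints "42: Ada"
    }
}
```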
I’m a little disappointed that the pattern-matching syntax won’t be expression-based; instead, the emphasis is on is and switch, though I understand the rationale of introducing features step by step. Still, more expressive constructs help write concise code, and C# as we know it can be very "wordy," so such changes are also for the best.
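To illustrate the statement-based pattern matching mentioned above, here is a short, hypothetical example of C# 7 type patterns with switch and a when guard (the Shape hierarchy is made up, not anything from the interview):

```csharp
using System;

abstract class Shape { }
class Circle : Shape { public double Radius; }
class Rectangle : Shape { public double Width, Height; }

static class Area
{
    public static double Of(Shape shape)
    {
        switch (shape)
        {
            case Circle c:
                return Math.PI * c.Radius * c.Radius;
            case Rectangle r when r.Width == r.Height:
                return r.Width * r.Width; // square, matched via a guard
            case Rectangle r:
                return r.Width * r.Height;
            case null:
                throw new ArgumentNullException(nameof(shape));
            default:
                throw new ArgumentException("Unknown shape", nameof(shape));
        }
    }
}
```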

C# vs Powershell — 1:0

JUG.ru Group: At last year’s DotNext you talked about C# scripting. How well do you think the talk "caught on"? Is this kind of C# scripting in demand among .NET developers?
Filip W: Oh, I got an incredible response from the audience! I think scripting is one of the coolest things about Roslyn, because it opens up a lot of interesting use cases that weren’t previously available in the .NET ecosystem. Sure, you could use PowerShell before, but being able to write scripts in C# is different, given how familiar the language is to .NET developers. Now we can see a sharp increase in the use of C# scripts in commercial projects: Azure Functions, game extensions, and more are built on them. Add to that the language services and IntelliSense/debugging support for C# that you can get in a lightweight editor like VS Code, and you get a very pleasant development process. The funny thing is that C#, despite not being a scripting language, has gained an environment so powerful that hardly any scripting language can compete with it in terms of productivity.
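For context, a minimal sketch of hosting C# scripts with the Roslyn scripting API (the Microsoft.CodeAnalysis.CSharp.Scripting NuGet package); the globals type and the script text here are invented for illustration:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.CodeAnalysis.CSharp.Scripting;
using Microsoft.CodeAnalysis.Scripting;

public class ScriptGlobals
{
    // Members of the globals object are visible to the script as top-level identifiers.
    public int Multiplier { get; set; }
}

class Program
{
    static async Task Main()
    {
        var options = ScriptOptions.Default.AddImports("System");

        // Evaluate a script that uses the host-provided global 'Multiplier'.
        int result = await CSharpScript.EvaluateAsync<int>(
            "Multiplier * 21",
            options,
            globals: new ScriptGlobals { Multiplier = 2 });

        Console.WriteLine(result); // 42
    }
}
```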
In Moscow, in the discussion zone, we spent almost two hours discussing the complexities and uses of C# scripts: security, memory management, application extensions (script-based plugin systems), and even a remote REPL for managing running processes. That was awesome!

Developing your own code inspections with Roslyn

JUG.ru Group: Besides scripting, you also deal with static code analysis. Tell me, who would need to develop their own analyzer when VS and ReSharper are already on the market?
Filip W: A custom analyzer is usually needed by team leads who handle code review and are responsible for the team’s code quality in general. It is important to understand that every team faces its own issues and imperfections in the development process that are relevant only in the context of its project. Both ReSharper and VS are universal tools designed for a wide audience, but what do you do if you need to enforce a particular pattern or make code comply with your corporate guidelines? For example, set rules for naming classes and variables, make sure that your internal API is used only as intended, that code is documented according to your standards, or that HTTP endpoints follow an established convention. Sometimes there are strange requirements, too: I once worked on a project where tabs and #region directives were forbidden at the compiler level.
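A hedged sketch of the kind of team-specific rule Filip describes: a Roslyn DiagnosticAnalyzer that reports every #region directive as a warning. The rule id, messages, and category below are made up; this is not code from the interview.

```csharp
using System.Collections.Immutable;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.CodeAnalysis.Diagnostics;

[DiagnosticAnalyzer(LanguageNames.CSharp)]
public class NoRegionsAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        id: "TEAM001",                                   // hypothetical rule id
        title: "#region directives are not allowed",
        messageFormat: "Remove this #region directive",
        category: "TeamStyle",
        defaultSeverity: DiagnosticSeverity.Warning,
        isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        => ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        // Runs once per syntax tree; #region directives live in structured trivia.
        context.RegisterSyntaxTreeAction(treeContext =>
        {
            var root = treeContext.Tree.GetRoot(treeContext.CancellationToken);
            foreach (var region in root.DescendantNodes(descendIntoTrivia: true)
                                       .OfType<RegionDirectiveTriviaSyntax>())
            {
                treeContext.ReportDiagnostic(Diagnostic.Create(Rule, region.GetLocation()));
            }
        });
    }
}
```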
However, even if you never write your own analyzer, it is important to understand how analyzers work "under the hood". As in other areas of programming, knowing the principles behind them and how the compiler API drives them is very useful.
JUG.ru Group: Speaking of the compiler, which Roslyn compiler APIs have made your life easier than before?
Filip W: Is this a trick question? The old compiler didn’t let you do anything except feed it code and get DLL/EXE files as output. So for me the most important thing about Roslyn is that it’s a true compiler-as-a-service, where each step of the compilation pipeline has a public API that you can use in your own way. It’s also amazing that before Roslyn there was no official C# AST library (you could only find third-party alternatives).
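As a small sketch of that compiler-as-a-service access: parsing a snippet with the Roslyn Syntax API and walking the resulting AST. The snippet being parsed is an arbitrary example, not code from the interview.

```csharp
using System;
using System.Linq;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class SyntaxTreeDemo
{
    static void Main()
    {
        var tree = CSharpSyntaxTree.ParseText(@"
            class Greeter
            {
                void SayHello(string name) => Console.WriteLine(""Hello, "" + name);
            }");

        var root = tree.GetRoot();

        // Enumerate method declarations straight from the syntax tree.
        foreach (var method in root.DescendantNodes().OfType<MethodDeclarationSyntax>())
        {
            Console.WriteLine(
                $"{method.Identifier.Text} has {method.ParameterList.Parameters.Count} parameter(s)");
        }
    }
}
```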
JUG.ru Group: By the way, what about backward compatibility? How likely is it that my self-written analyzer will fall apart in the next Roslyn release?
Filip W: One thing I know for sure is that the Roslyn team pays a lot of attention to backward compatibility! If you dig into the compiler code, you can find incredible examples of this. For instance, the source contains comments marked "DELIBERATE SPEC VIOLATION". What are they? It turns out that the old CSC compiler, due to bugs or some misunderstanding, violates the C# specification in places. The Roslyn team didn’t want to make any changes that would break anything, and so we got a new compiler where the developers deliberately violate the C# spec in those same places and document it as a deliberate spec violation 🙂
I realize that compiler backward compatibility and API backward compatibility are different topics, but this example illustrates the team’s mentality well. I’ve contributed to Roslyn myself, and I can tell you that one of the most tedious aspects of code review there is the amount of scrutiny given to every public API, precisely because it will have to be maintained in Roslyn for a long time. So honestly, I wouldn’t worry about backward compatibility.
JUG.ru Group: How did you come to start researching the Roslyn API in the first place? What kind of problems did you originally want to solve?
Filip W: I originally got into the Roslyn community because of scriptcs, a scripting project we were working on, one of the projects that later helped shape this whole C# scripting story. Then I got into the OmniSharp project, which brings IntelliSense and C# language services to editors like Emacs, Vim, or Atom, although of course the largest and most recognizable "consumer" of OmniSharp in the .NET community is Visual Studio Code. That’s where I started developing tools for code analysis, refactoring, and many other IDE-level language features.
JUG.ru Group: Tell me, what will happen to static code analysis in the near future? What should we expect in the next 1-3-5 years?
Filip W: I think we will see a lot of "live" diagnostics. A friend of mine, Josh Varty, built a very cool Visual Studio add-on called (surprise!) Alive, which runs blocks of your source code and instantly shows you how your method or your loop will behave, warning you about errors that might occur at runtime. This goes beyond static or semantic code analysis, and it’s all built on Roslyn.
So in general, in my opinion, we will see more and more advanced analyses, such as tracking potential null references through symbolic execution. For now, the community is still just sorting out the capabilities that Roslyn provides. In addition, I hope to see tighter integration of Roslyn-based analyzers into third-party tools such as OmniSharp or ReSharper. Such analyzers already exist for Visual Studio Code, but they are far from perfect.
JUG.ru Group: Thank you, Filip, see you at DotNext!


Filip will give the talk "Building code analysis tools with the .NET Compiler Platform (Roslyn)" at the upcoming conference in St. Petersburg, on the same stage as Jon Skeet, Sasha Goldshtein, and other MVPs. Details about the speakers and talks are available at DotNext 2017 Piter.
P.S. As a reminder, early-bird registration is open until the end of February, so for now you can register and save a couple of thousand.
